Dec 01 13:58:05 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 01 13:58:05 crc restorecon[4584]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 13:58:05
crc restorecon[4584]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 
13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc 
restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 
crc restorecon[4584]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 
crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 
13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 13:58:05 crc 
restorecon[4584]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 
13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 
13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc 
restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 13:58:05 crc restorecon[4584]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 13:58:05 crc restorecon[4584]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 01 13:58:06 crc kubenswrapper[4585]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 13:58:06 crc kubenswrapper[4585]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 01 13:58:06 crc kubenswrapper[4585]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 13:58:06 crc kubenswrapper[4585]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 01 13:58:06 crc kubenswrapper[4585]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 01 13:58:06 crc kubenswrapper[4585]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.192694 4585 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196300 4585 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196342 4585 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196349 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196355 4585 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196361 4585 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196370 4585 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196381 4585 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196386 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196393 4585 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196398 4585 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196403 4585 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196421 4585 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196427 4585 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196433 4585 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196438 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196443 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196448 4585 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196453 4585 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196459 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196464 4585 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196468 4585 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196473 4585 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196478 4585 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196483 4585 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196487 4585 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196491 4585 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196495 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196499 4585 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196504 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196507 4585 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196512 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196516 4585 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196521 4585 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196526 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196533 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196538 4585 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196544 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196552 4585 
feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196559 4585 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196566 4585 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196571 4585 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196576 4585 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196580 4585 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196584 4585 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196588 4585 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196592 4585 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196595 4585 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196600 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196604 4585 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196607 4585 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196611 4585 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196615 4585 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196620 4585 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196624 4585 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196628 4585 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196632 4585 feature_gate.go:330] unrecognized feature gate: Example Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196638 4585 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196643 4585 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196648 4585 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196653 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196657 4585 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196661 4585 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196666 4585 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196669 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196673 4585 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196676 4585 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196680 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196683 4585 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196687 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196690 4585 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.196693 4585 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196817 4585 flags.go:64] FLAG: --address="0.0.0.0" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196830 4585 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196839 4585 flags.go:64] FLAG: --anonymous-auth="true" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196846 4585 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196853 4585 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196859 4585 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196868 4585 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196875 4585 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196881 4585 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196885 4585 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196891 4585 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196896 4585 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196903 4585 flags.go:64] FLAG: 
--cgroup-driver="cgroupfs" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196908 4585 flags.go:64] FLAG: --cgroup-root="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196915 4585 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196920 4585 flags.go:64] FLAG: --client-ca-file="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196926 4585 flags.go:64] FLAG: --cloud-config="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196931 4585 flags.go:64] FLAG: --cloud-provider="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196935 4585 flags.go:64] FLAG: --cluster-dns="[]" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196943 4585 flags.go:64] FLAG: --cluster-domain="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196949 4585 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196955 4585 flags.go:64] FLAG: --config-dir="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196960 4585 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196987 4585 flags.go:64] FLAG: --container-log-max-files="5" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.196997 4585 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197002 4585 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197007 4585 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197013 4585 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197018 4585 flags.go:64] FLAG: --contention-profiling="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197025 4585 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197030 4585 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197035 4585 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197040 4585 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197047 4585 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197052 4585 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197058 4585 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197063 4585 flags.go:64] FLAG: --enable-load-reader="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197067 4585 flags.go:64] FLAG: --enable-server="true" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197073 4585 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197081 4585 flags.go:64] FLAG: --event-burst="100" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197086 4585 flags.go:64] FLAG: --event-qps="50" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197091 4585 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197095 4585 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 01 
13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197101 4585 flags.go:64] FLAG: --eviction-hard="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197108 4585 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197113 4585 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197118 4585 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197124 4585 flags.go:64] FLAG: --eviction-soft="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197132 4585 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197138 4585 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197144 4585 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197151 4585 flags.go:64] FLAG: --experimental-mounter-path="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197157 4585 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197163 4585 flags.go:64] FLAG: --fail-swap-on="true" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197169 4585 flags.go:64] FLAG: --feature-gates="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197177 4585 flags.go:64] FLAG: --file-check-frequency="20s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197183 4585 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197187 4585 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197192 4585 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197197 4585 flags.go:64] FLAG: --healthz-port="10248" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197202 4585 flags.go:64] FLAG: --help="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197206 4585 flags.go:64] FLAG: --hostname-override="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197210 4585 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197215 4585 flags.go:64] FLAG: --http-check-frequency="20s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197219 4585 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197224 4585 flags.go:64] FLAG: --image-credential-provider-config="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197228 4585 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197232 4585 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197236 4585 flags.go:64] FLAG: --image-service-endpoint="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197240 4585 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197244 4585 flags.go:64] FLAG: --kube-api-burst="100" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197249 4585 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197253 4585 flags.go:64] FLAG: --kube-api-qps="50" Dec 01 13:58:06 crc 
kubenswrapper[4585]: I1201 13:58:06.197257 4585 flags.go:64] FLAG: --kube-reserved="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197261 4585 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197265 4585 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197270 4585 flags.go:64] FLAG: --kubelet-cgroups="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197273 4585 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197278 4585 flags.go:64] FLAG: --lock-file="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197282 4585 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197286 4585 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197291 4585 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197297 4585 flags.go:64] FLAG: --log-json-split-stream="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197302 4585 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197307 4585 flags.go:64] FLAG: --log-text-split-stream="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197311 4585 flags.go:64] FLAG: --logging-format="text" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197316 4585 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197320 4585 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197324 4585 flags.go:64] FLAG: --manifest-url="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197329 4585 flags.go:64] FLAG: --manifest-url-header="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197335 4585 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197340 4585 flags.go:64] FLAG: --max-open-files="1000000" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197347 4585 flags.go:64] FLAG: --max-pods="110" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197353 4585 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197358 4585 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197363 4585 flags.go:64] FLAG: --memory-manager-policy="None" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197368 4585 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197373 4585 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197378 4585 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197384 4585 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197397 4585 flags.go:64] FLAG: --node-status-max-images="50" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197403 4585 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197409 4585 
flags.go:64] FLAG: --oom-score-adj="-999" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197414 4585 flags.go:64] FLAG: --pod-cidr="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197419 4585 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197427 4585 flags.go:64] FLAG: --pod-manifest-path="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197432 4585 flags.go:64] FLAG: --pod-max-pids="-1" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197437 4585 flags.go:64] FLAG: --pods-per-core="0" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197442 4585 flags.go:64] FLAG: --port="10250" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197447 4585 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197452 4585 flags.go:64] FLAG: --provider-id="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197457 4585 flags.go:64] FLAG: --qos-reserved="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197462 4585 flags.go:64] FLAG: --read-only-port="10255" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197467 4585 flags.go:64] FLAG: --register-node="true" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197472 4585 flags.go:64] FLAG: --register-schedulable="true" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197476 4585 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197486 4585 flags.go:64] FLAG: --registry-burst="10" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197490 4585 flags.go:64] FLAG: --registry-qps="5" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197494 4585 flags.go:64] FLAG: --reserved-cpus="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197499 4585 flags.go:64] FLAG: --reserved-memory="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197505 4585 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197510 4585 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197514 4585 flags.go:64] FLAG: --rotate-certificates="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197518 4585 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197523 4585 flags.go:64] FLAG: --runonce="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197528 4585 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197533 4585 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197539 4585 flags.go:64] FLAG: --seccomp-default="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197543 4585 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197548 4585 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197553 4585 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197559 4585 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 
13:58:06.197564 4585 flags.go:64] FLAG: --storage-driver-password="root" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197569 4585 flags.go:64] FLAG: --storage-driver-secure="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197574 4585 flags.go:64] FLAG: --storage-driver-table="stats" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197624 4585 flags.go:64] FLAG: --storage-driver-user="root" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197630 4585 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197636 4585 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197642 4585 flags.go:64] FLAG: --system-cgroups="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197647 4585 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197655 4585 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197660 4585 flags.go:64] FLAG: --tls-cert-file="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197665 4585 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197671 4585 flags.go:64] FLAG: --tls-min-version="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197676 4585 flags.go:64] FLAG: --tls-private-key-file="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197681 4585 flags.go:64] FLAG: --topology-manager-policy="none" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197687 4585 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197697 4585 flags.go:64] FLAG: --topology-manager-scope="container" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197702 4585 flags.go:64] FLAG: --v="2" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197709 4585 flags.go:64] FLAG: --version="false" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197716 4585 flags.go:64] FLAG: --vmodule="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197722 4585 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.197727 4585 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197835 4585 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197840 4585 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197844 4585 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197849 4585 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197853 4585 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197856 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197860 4585 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197863 4585 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197867 4585 
feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197870 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197873 4585 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197881 4585 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197884 4585 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197888 4585 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197891 4585 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197895 4585 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197898 4585 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197902 4585 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197905 4585 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197909 4585 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197912 4585 feature_gate.go:330] unrecognized feature gate: Example Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197916 4585 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197919 4585 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197923 4585 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197926 4585 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197930 4585 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197935 4585 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197938 4585 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197943 4585 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197948 4585 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197952 4585 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197957 4585 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197961 4585 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197965 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197988 4585 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197992 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.197996 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198000 4585 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198003 4585 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198007 4585 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198011 4585 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198014 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198018 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198024 4585 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198027 4585 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198031 4585 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198034 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198037 4585 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198041 4585 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198044 4585 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198047 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198051 4585 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198054 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198058 4585 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198061 4585 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198065 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198068 4585 
feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198072 4585 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198077 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198080 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198083 4585 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198087 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198091 4585 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198096 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198100 4585 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198104 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198107 4585 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198111 4585 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198115 4585 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198118 4585 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.198122 4585 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.198129 4585 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.208609 4585 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.208676 4585 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208798 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208808 4585 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208815 4585 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208821 4585 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208827 4585 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 13:58:06 crc 
kubenswrapper[4585]: W1201 13:58:06.208832 4585 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208837 4585 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208843 4585 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208848 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208853 4585 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208860 4585 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208869 4585 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208879 4585 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208886 4585 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208892 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208898 4585 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208904 4585 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208910 4585 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208916 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208922 4585 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208929 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208938 4585 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208953 4585 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208960 4585 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.208995 4585 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209003 4585 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209010 4585 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209018 4585 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209025 4585 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209032 4585 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209038 
4585 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209045 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209051 4585 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209057 4585 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209063 4585 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209070 4585 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209076 4585 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209083 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209089 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209096 4585 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209102 4585 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209108 4585 feature_gate.go:330] unrecognized feature gate: Example Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209114 4585 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209119 4585 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209128 4585 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209137 4585 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209144 4585 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209150 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209156 4585 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209162 4585 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209168 4585 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209174 4585 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209181 4585 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209188 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209196 4585 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209203 4585 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209212 4585 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209224 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209232 4585 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209238 4585 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209247 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209254 4585 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209261 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209268 4585 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209273 4585 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209279 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209284 4585 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209290 4585 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209295 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209300 4585 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209305 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.209316 4585 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209543 4585 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209555 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209561 4585 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209567 4585 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209573 4585 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209578 4585 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209584 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209589 4585 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209595 4585 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209600 4585 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209605 4585 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209610 4585 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209615 4585 feature_gate.go:330] unrecognized 
feature gate: MultiArchInstallAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209621 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209627 4585 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209632 4585 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209637 4585 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209643 4585 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209648 4585 feature_gate.go:330] unrecognized feature gate: Example Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209654 4585 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209660 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209666 4585 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209672 4585 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209678 4585 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209683 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209688 4585 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209693 4585 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209699 4585 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209706 4585 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209714 4585 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209721 4585 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209727 4585 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209734 4585 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209740 4585 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209746 4585 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209752 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209758 4585 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209765 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209772 4585 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209777 4585 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209782 4585 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209788 4585 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209793 4585 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209798 4585 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209803 4585 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209808 4585 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209813 4585 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209818 4585 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209824 4585 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209829 4585 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209834 4585 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209839 4585 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209846 4585 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
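Each pass over the configured gate set re-emits the same feature_gate.go:330 warnings for cluster-level gate names the kubelet does not recognize; only the recognized gates end up in the "feature gates: {map[...]}" summaries. A minimal, purely illustrative sketch for pulling the distinct unrecognized names (and how often each was warned about) out of a saved journal dump like this one; the file name is hypothetical.

```python
# Sketch: collect the distinct "unrecognized feature gate" names from a saved
# journal dump such as this one. The regex simply mirrors the
# feature_gate.go:330 warning text above; the input path is a placeholder.
import re
from collections import Counter

PATTERN = re.compile(r"unrecognized feature gate: (\S+)")

def unrecognized_gates(path: str) -> Counter:
    counts: Counter = Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            for name in PATTERN.findall(line):
                counts[name] += 1
    return counts

if __name__ == "__main__":
    gates = unrecognized_gates("kubelet-journal.log")  # hypothetical file name
    for name, seen in sorted(gates.items()):
        print(f"{name}: warned {seen} time(s)")
```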
Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209852 4585 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209861 4585 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209869 4585 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209876 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209884 4585 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209891 4585 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209899 4585 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209905 4585 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209912 4585 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209918 4585 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209924 4585 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209933 4585 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209941 4585 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209948 4585 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209955 4585 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209961 4585 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.209991 4585 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.210001 4585 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.210011 4585 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.210669 4585 server.go:940] "Client rotation is on, will bootstrap in background" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.214951 4585 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.215109 4585 certificate_store.go:130] Loading cert/key pair from 
"/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.215842 4585 server.go:997] "Starting client certificate rotation" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.215884 4585 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.216392 4585 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-11 22:43:05.795938371 +0000 UTC Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.216545 4585 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 992h44m59.579397681s for next certificate rotation Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.227362 4585 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.229845 4585 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.237627 4585 log.go:25] "Validated CRI v1 runtime API" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.252589 4585 log.go:25] "Validated CRI v1 image API" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.254505 4585 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.257785 4585 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-01-13-53-11-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.257818 4585 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.269450 4585 manager.go:217] Machine: {Timestamp:2025-12-01 13:58:06.268190907 +0000 UTC m=+0.252404782 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:fcf25ef7-53cd-4591-aa80-a73d07c13768 BootID:e38ee8f4-73f3-495c-b626-577353e9a008 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 
HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:b6:36:7a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:b6:36:7a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:93:0b:cf Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:7b:61:19 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:97:bb:a2 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a4:e5:08 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:06:ab:82:e8:78:94 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:06:32:75:fa:7f:18 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.269672 4585 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
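Note: the certificate_manager entries a few lines above report a kube-apiserver client certificate expiring 2026-02-24 05:52:08 UTC, a rotation deadline of 2026-01-11 22:43:05 UTC, and then a wait of roughly 992h45m; the wait is simply the rotation deadline minus the current time. A small standalone Go check (not kubelet code; timestamps copied from the log) reproduces the arithmetic:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Rotation deadline copied verbatim from the certificate_manager log line above.
	deadline, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST",
		"2026-01-11 22:43:05.795938371 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// The journal timestamp of that log line (whole-second resolution).
	now, err := time.Parse("2006-01-02 15:04:05 -0700 MST", "2025-12-01 13:58:06 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 992h44m59.795938371s; the kubelet logged 992h44m59.579397681s because it
	// subtracted its own sub-second clock reading (13:58:06.216545) instead of the
	// whole-second journal timestamp.
	fmt.Println("waiting", deadline.Sub(now), "until next client certificate rotation")
}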
Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.269858 4585 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.271020 4585 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.271298 4585 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.271401 4585 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.271785 4585 topology_manager.go:138] "Creating topology manager with none policy" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.271800 4585 container_manager_linux.go:303] "Creating device plugin manager" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.272308 4585 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.272368 4585 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.273145 4585 state_mem.go:36] "Initialized new in-memory state store" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.273689 4585 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.275018 4585 kubelet.go:418] "Attempting to sync node with API server" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.275043 4585 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
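Note: the container_manager_linux entry above embeds the resolved node config as JSON, including the systemd cgroup driver, a per-pod PID limit of 4096, system reservations (200m CPU, 350Mi memory, 350Mi ephemeral storage), and the hard eviction thresholds. A minimal sketch that decodes a trimmed copy of that JSON with ad-hoc struct types (not kubelet's own types) makes the thresholds easier to read:

package main

import (
	"encoding/json"
	"fmt"
)

// Ad-hoc types covering only the fields inspected here.
type threshold struct {
	Signal   string
	Operator string
	Value    struct {
		Quantity   *string
		Percentage float64
	}
}

type nodeConfig struct {
	NodeName               string
	CgroupDriver           string
	PodPidsLimit           int64
	SystemReserved         map[string]string
	HardEvictionThresholds []threshold
}

// A trimmed copy of the JSON embedded in the container_manager_linux.go:272 log line above.
const raw = `{
  "NodeName": "crc",
  "CgroupDriver": "systemd",
  "PodPidsLimit": 4096,
  "SystemReserved": {"cpu": "200m", "ephemeral-storage": "350Mi", "memory": "350Mi"},
  "HardEvictionThresholds": [
    {"Signal": "imagefs.inodesFree", "Operator": "LessThan", "Value": {"Quantity": null, "Percentage": 0.05}},
    {"Signal": "memory.available",   "Operator": "LessThan", "Value": {"Quantity": "100Mi", "Percentage": 0}},
    {"Signal": "nodefs.available",   "Operator": "LessThan", "Value": {"Quantity": null, "Percentage": 0.1}}
  ]
}`

func main() {
	var cfg nodeConfig
	if err := json.Unmarshal([]byte(raw), &cfg); err != nil {
		panic(err)
	}
	fmt.Printf("node=%s driver=%s pidsLimit=%d reserved=%v\n",
		cfg.NodeName, cfg.CgroupDriver, cfg.PodPidsLimit, cfg.SystemReserved)
	for _, t := range cfg.HardEvictionThresholds {
		if t.Value.Quantity != nil {
			fmt.Printf("evict when %s %s %s\n", t.Signal, t.Operator, *t.Value.Quantity)
		} else {
			fmt.Printf("evict when %s %s %.0f%%\n", t.Signal, t.Operator, t.Value.Percentage*100)
		}
	}
}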
Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.275071 4585 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.275085 4585 kubelet.go:324] "Adding apiserver pod source" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.275098 4585 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.276985 4585 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.277366 4585 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.277491 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.277936 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.277471 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.278070 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.279867 4585 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281071 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281101 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281109 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281117 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281130 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281137 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281145 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281157 4585 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281168 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281177 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281191 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281199 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281463 4585 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.281995 4585 server.go:1280] "Started kubelet" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.283333 4585 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.283394 4585 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.284413 4585 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.284576 4585 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.285044 4585 server.go:460] "Adding debug handlers to kubelet server" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.286285 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.286586 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 16:35:40.444592449 +0000 UTC Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.286659 4585 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1082h37m34.157958866s for next certificate rotation Dec 01 13:58:06 crc systemd[1]: Started Kubernetes Kubelet. 
Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.286545 4585 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d1c1085621b0e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 13:58:06.28195611 +0000 UTC m=+0.266169965,LastTimestamp:2025-12-01 13:58:06.28195611 +0000 UTC m=+0.266169965,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.287406 4585 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.287624 4585 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.287720 4585 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.287728 4585 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.287867 4585 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.288930 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.289555 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.292517 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="200ms" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.292632 4585 factory.go:153] Registering CRI-O factory Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.292650 4585 factory.go:221] Registration of the crio container factory successfully Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.292730 4585 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.292739 4585 factory.go:55] Registering systemd factory Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.292746 4585 factory.go:221] Registration of the systemd container factory successfully Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.292768 4585 factory.go:103] Registering Raw factory Dec 01 13:58:06 crc 
kubenswrapper[4585]: I1201 13:58:06.292785 4585 manager.go:1196] Started watching for new ooms in manager Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.296057 4585 manager.go:319] Starting recovery of all containers Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.309011 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.309510 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.309603 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.309663 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.309762 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.309826 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.309896 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.309959 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310084 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310163 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310264 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310336 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310395 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310456 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310518 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310582 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310647 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310705 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310775 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310856 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.310936 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.311029 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.311124 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.311214 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.311300 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.311385 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313367 4585 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313428 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313449 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313461 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313472 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313482 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313492 4585 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313503 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313514 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313525 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313535 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313546 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313558 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313569 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313585 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313596 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313611 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313624 4585 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313636 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313649 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313659 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313672 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313683 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313693 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313706 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313718 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313730 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313743 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313763 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313776 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313786 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313797 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313808 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313818 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313830 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313877 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313906 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313917 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313927 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313939 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313951 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313961 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.313992 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314004 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314016 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314027 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314037 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314050 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314063 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314093 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314104 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314114 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314124 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314134 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314143 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314156 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314168 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314208 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314220 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314233 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314245 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314259 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314271 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314285 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314298 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314315 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314327 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314341 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314351 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314362 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314374 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314384 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314394 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314405 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314414 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314423 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314432 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314442 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314454 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314469 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314481 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314493 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314503 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314512 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314523 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314535 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314547 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314557 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314568 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314578 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314587 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314597 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314606 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314615 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314625 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314637 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314648 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314661 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314676 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314690 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314702 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314713 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314731 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314745 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314758 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314771 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314783 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314799 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314812 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314825 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314835 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314848 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314859 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314871 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314881 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314891 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314902 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314911 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314921 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314931 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314941 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314951 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.314960 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315010 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315020 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315031 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315041 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315052 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315065 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315076 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315086 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315097 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315107 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315116 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315129 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315146 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315155 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315166 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315178 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315193 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315203 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315214 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315224 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315232 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315243 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315252 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315262 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315270 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315280 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315293 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315305 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315319 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315332 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315342 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315355 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315365 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315378 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315389 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315401 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315414 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315428 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315442 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315453 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315465 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315476 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315488 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315502 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315516 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315530 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315542 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315556 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315570 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315584 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315600 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315615 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315628 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315644 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315658 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315672 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315684 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315697 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315709 4585 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315722 4585 reconstruct.go:97] "Volume reconstruction finished" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.315733 4585 reconciler.go:26] "Reconciler: start to sync state" 
Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.318724 4585 manager.go:324] Recovery completed Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.331254 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.338153 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.338240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.338251 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.342523 4585 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.342546 4585 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.342573 4585 state_mem.go:36] "Initialized new in-memory state store" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.347084 4585 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.349212 4585 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.388474 4585 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.411153 4585 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.411239 4585 kubelet.go:2335] "Starting kubelet main sync loop" Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.411364 4585 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 01 13:58:06 crc kubenswrapper[4585]: W1201 13:58:06.412611 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.412684 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.454480 4585 policy_none.go:49] "None policy: Start" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.455546 4585 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.455600 4585 state_mem.go:35] "Initializing new in-memory state store" Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.488835 4585 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.493830 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="400ms" Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.511745 4585 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.514256 4585 manager.go:334] "Starting Device Plugin manager" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.514324 4585 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.514339 4585 server.go:79] "Starting device plugin registration server" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.515037 4585 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.515056 4585 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.515272 4585 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.515346 4585 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.515353 4585 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.522519 4585 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.616005 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.617904 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.617950 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.618021 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.618052 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.618604 4585 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.712408 4585 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.712568 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.713868 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 
crc kubenswrapper[4585]: I1201 13:58:06.713920 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.713929 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.714065 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.714402 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.714435 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.715040 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.715064 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.715073 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.715178 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.715554 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.715593 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.715994 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.716017 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.716026 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.716414 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.716436 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.716444 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.716603 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.716623 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.716677 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.716690 4585 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.716747 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.716806 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.717331 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.717360 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.717372 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.717516 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.717695 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.717721 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.718044 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.718072 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.718082 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.718356 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.718380 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.718391 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.718527 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.718554 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.718526 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.718639 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.718694 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.719164 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.719189 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.719201 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.818920 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.819849 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820134 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820217 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820350 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820397 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820415 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820448 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820399 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820543 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820576 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820606 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820639 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820695 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820762 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820798 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820842 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820879 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820918 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.820949 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.821682 4585 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Dec 01 13:58:06 crc kubenswrapper[4585]: E1201 13:58:06.895702 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="800ms" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922066 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922292 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922216 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922317 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922425 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922440 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922479 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922514 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922510 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922560 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922530 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922584 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922500 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922611 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922629 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922612 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922647 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922666 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922719 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922704 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922746 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922748 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922779 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922807 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922827 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922849 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922858 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922883 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.922889 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:06 crc kubenswrapper[4585]: I1201 13:58:06.923001 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.056780 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 13:58:07 crc kubenswrapper[4585]: W1201 13:58:07.076427 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-b2b668364569c28ccbbdffc0f55fab420fd0d95bb72d617b8e7ff60a29c6b083 WatchSource:0}: Error finding container b2b668364569c28ccbbdffc0f55fab420fd0d95bb72d617b8e7ff60a29c6b083: Status 404 returned error can't find the container with id b2b668364569c28ccbbdffc0f55fab420fd0d95bb72d617b8e7ff60a29c6b083 Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.083927 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 13:58:07 crc kubenswrapper[4585]: W1201 13:58:07.099426 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Dec 01 13:58:07 crc kubenswrapper[4585]: E1201 13:58:07.099525 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.100842 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:07 crc kubenswrapper[4585]: W1201 13:58:07.103534 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-078c055c119757693d5e5f90f25af65f83191dd0572da7b36bc6c1dbde95bb44 WatchSource:0}: Error finding container 078c055c119757693d5e5f90f25af65f83191dd0572da7b36bc6c1dbde95bb44: Status 404 returned error can't find the container with id 078c055c119757693d5e5f90f25af65f83191dd0572da7b36bc6c1dbde95bb44 Dec 01 13:58:07 crc kubenswrapper[4585]: W1201 13:58:07.115196 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-1a32b8037ff3eb1f3366e8c01053d64827e092614e9421d6fef45084490cf319 WatchSource:0}: Error finding container 1a32b8037ff3eb1f3366e8c01053d64827e092614e9421d6fef45084490cf319: Status 404 returned error can't find the container with id 1a32b8037ff3eb1f3366e8c01053d64827e092614e9421d6fef45084490cf319 Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.123174 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.135510 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 13:58:07 crc kubenswrapper[4585]: W1201 13:58:07.135887 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-997bb5fbd0cbb338525a16d6bc42b904d6a072148971f8171824f380b045034c WatchSource:0}: Error finding container 997bb5fbd0cbb338525a16d6bc42b904d6a072148971f8171824f380b045034c: Status 404 returned error can't find the container with id 997bb5fbd0cbb338525a16d6bc42b904d6a072148971f8171824f380b045034c Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.222614 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.225318 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.225382 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.225397 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.225432 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 13:58:07 crc kubenswrapper[4585]: E1201 13:58:07.226082 4585 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Dec 01 13:58:07 crc kubenswrapper[4585]: W1201 13:58:07.283464 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Dec 01 13:58:07 crc kubenswrapper[4585]: E1201 
13:58:07.283584 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.285423 4585 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.418522 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"078c055c119757693d5e5f90f25af65f83191dd0572da7b36bc6c1dbde95bb44"} Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.419797 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b2b668364569c28ccbbdffc0f55fab420fd0d95bb72d617b8e7ff60a29c6b083"} Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.420712 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"628e665df7997bb137045295b6e4d1e0fb0050445401ea1d3c9ecdf79fae834e"} Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.421879 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"997bb5fbd0cbb338525a16d6bc42b904d6a072148971f8171824f380b045034c"} Dec 01 13:58:07 crc kubenswrapper[4585]: I1201 13:58:07.422881 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1a32b8037ff3eb1f3366e8c01053d64827e092614e9421d6fef45084490cf319"} Dec 01 13:58:07 crc kubenswrapper[4585]: W1201 13:58:07.603834 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Dec 01 13:58:07 crc kubenswrapper[4585]: E1201 13:58:07.603955 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Dec 01 13:58:07 crc kubenswrapper[4585]: E1201 13:58:07.696878 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="1.6s" Dec 01 13:58:07 crc kubenswrapper[4585]: W1201 13:58:07.953333 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Dec 01 13:58:07 crc kubenswrapper[4585]: E1201 13:58:07.953446 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.44:6443: connect: connection refused" logger="UnhandledError" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.026873 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.034323 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.034366 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.034376 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.034402 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 13:58:08 crc kubenswrapper[4585]: E1201 13:58:08.034940 4585 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.44:6443: connect: connection refused" node="crc" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.286127 4585 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.44:6443: connect: connection refused Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.427799 4585 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805" exitCode=0 Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.427950 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.427946 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805"} Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.429184 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.429221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.429231 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.431180 4585 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33" exitCode=0 Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.431261 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33"} Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.431300 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.432062 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.432092 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.432101 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.434454 4585 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db" exitCode=0 Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.434515 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.434514 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db"} Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.436066 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.436090 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.436100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.437648 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c"} Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.437683 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f"} Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.437694 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8"} Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.437704 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7"} Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.437710 4585 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.438470 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.438488 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.438497 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.438864 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda" exitCode=0 Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.438899 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda"} Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.438986 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.439644 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.439668 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.439676 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.443806 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.444820 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.444844 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.444855 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:08 crc kubenswrapper[4585]: I1201 13:58:08.800897 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.445030 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a2e64b25d69a7d86297a1761d8a2c7e62e508e69a19ff6d48679efee71724b3d"} Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.445118 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fbb6a609f35d4968e1f0a001362568ddcc21b80f897ad76b3b7f9e7f4b1651af"} Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.445137 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"43ecbe420cb6fc8702eae53c7d3a369f12a986685824f1f08c15398e47985cca"} Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.445179 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.446260 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.446296 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.446306 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.450751 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da"} Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.450785 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878"} Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.450795 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4"} Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.450804 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc"} Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.450813 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0"} Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.450918 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.451714 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.451735 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.451744 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.453392 4585 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5" exitCode=0 Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.453433 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5"} Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.453514 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.454166 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.454237 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.454299 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.457222 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b4cd6715a83e1d1fc7d714956406fd0260653d09700c83f129dc4399bb089228"} Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.457322 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.457357 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.458614 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.458654 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.458665 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.458754 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.458791 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.458805 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.550199 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.636055 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.637528 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.637575 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.637584 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:09 crc kubenswrapper[4585]: I1201 13:58:09.637615 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 13:58:09 crc 
kubenswrapper[4585]: I1201 13:58:09.681356 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.061964 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.463840 4585 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967" exitCode=0 Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.463904 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967"} Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.463994 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.464012 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.464110 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.464671 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.464718 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.464996 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465049 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465050 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465077 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465087 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465059 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465665 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465692 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465703 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465718 4585 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465726 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465715 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465830 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.465839 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:10 crc kubenswrapper[4585]: I1201 13:58:10.927271 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.469578 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.471614 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.471648 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.471727 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.471640 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8"} Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.471823 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447"} Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.471847 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533"} Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.471861 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774"} Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.473190 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.473227 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.473239 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.473279 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 
13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.473302 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.473315 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.473530 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.473563 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.473574 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.802148 4585 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 13:58:11 crc kubenswrapper[4585]: I1201 13:58:11.802249 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.345375 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.478338 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8"} Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.478517 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.479173 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.479180 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.479239 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.479441 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.479456 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.479803 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.479826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 
13:58:12.479837 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.480651 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.480675 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.480686 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:12 crc kubenswrapper[4585]: I1201 13:58:12.863740 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 01 13:58:13 crc kubenswrapper[4585]: I1201 13:58:13.480600 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:13 crc kubenswrapper[4585]: I1201 13:58:13.481718 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:13 crc kubenswrapper[4585]: I1201 13:58:13.481933 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:13 crc kubenswrapper[4585]: I1201 13:58:13.481958 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:13 crc kubenswrapper[4585]: I1201 13:58:13.481967 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:13 crc kubenswrapper[4585]: I1201 13:58:13.482729 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:13 crc kubenswrapper[4585]: I1201 13:58:13.482778 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:13 crc kubenswrapper[4585]: I1201 13:58:13.482787 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:14 crc kubenswrapper[4585]: I1201 13:58:14.483585 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:14 crc kubenswrapper[4585]: I1201 13:58:14.485015 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:14 crc kubenswrapper[4585]: I1201 13:58:14.485052 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:14 crc kubenswrapper[4585]: I1201 13:58:14.485081 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:15 crc kubenswrapper[4585]: I1201 13:58:15.537350 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:15 crc kubenswrapper[4585]: I1201 13:58:15.537599 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:15 crc kubenswrapper[4585]: I1201 13:58:15.538887 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:15 crc kubenswrapper[4585]: I1201 13:58:15.538927 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:58:15 crc kubenswrapper[4585]: I1201 13:58:15.538937 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:15 crc kubenswrapper[4585]: I1201 13:58:15.542184 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:16 crc kubenswrapper[4585]: I1201 13:58:16.489851 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:16 crc kubenswrapper[4585]: I1201 13:58:16.491429 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:16 crc kubenswrapper[4585]: I1201 13:58:16.491696 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:16 crc kubenswrapper[4585]: I1201 13:58:16.491943 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:16 crc kubenswrapper[4585]: E1201 13:58:16.522651 4585 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.042880 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.043270 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.045148 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.045203 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.045218 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.285777 4585 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 01 13:58:19 crc kubenswrapper[4585]: E1201 13:58:19.298440 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.551070 4585 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.551168 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 
13:58:19 crc kubenswrapper[4585]: E1201 13:58:19.638613 4585 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 01 13:58:19 crc kubenswrapper[4585]: W1201 13:58:19.683754 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.684093 4585 trace.go:236] Trace[2002595135]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 13:58:09.682) (total time: 10001ms): Dec 01 13:58:19 crc kubenswrapper[4585]: Trace[2002595135]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:58:19.683) Dec 01 13:58:19 crc kubenswrapper[4585]: Trace[2002595135]: [10.001656893s] [10.001656893s] END Dec 01 13:58:19 crc kubenswrapper[4585]: E1201 13:58:19.684134 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.686366 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.686495 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.687661 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.687699 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.687710 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:19 crc kubenswrapper[4585]: W1201 13:58:19.794079 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.794232 4585 trace.go:236] Trace[1858574104]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 13:58:09.792) (total time: 10001ms): Dec 01 13:58:19 crc kubenswrapper[4585]: Trace[1858574104]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:58:19.794) Dec 01 13:58:19 crc kubenswrapper[4585]: Trace[1858574104]: [10.001652353s] [10.001652353s] END Dec 01 13:58:19 crc kubenswrapper[4585]: E1201 13:58:19.794257 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": 
net/http: TLS handshake timeout" logger="UnhandledError" Dec 01 13:58:19 crc kubenswrapper[4585]: W1201 13:58:19.917027 4585 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 01 13:58:19 crc kubenswrapper[4585]: I1201 13:58:19.917137 4585 trace.go:236] Trace[262640341]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 13:58:09.915) (total time: 10001ms): Dec 01 13:58:19 crc kubenswrapper[4585]: Trace[262640341]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:58:19.917) Dec 01 13:58:19 crc kubenswrapper[4585]: Trace[262640341]: [10.00124783s] [10.00124783s] END Dec 01 13:58:19 crc kubenswrapper[4585]: E1201 13:58:19.917163 4585 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 01 13:58:20 crc kubenswrapper[4585]: I1201 13:58:20.415646 4585 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 13:58:20 crc kubenswrapper[4585]: I1201 13:58:20.415719 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 13:58:20 crc kubenswrapper[4585]: I1201 13:58:20.441218 4585 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 13:58:20 crc kubenswrapper[4585]: I1201 13:58:20.441299 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 13:58:21 crc kubenswrapper[4585]: I1201 13:58:21.802600 4585 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 13:58:21 crc kubenswrapper[4585]: I1201 13:58:21.802719 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.353105 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.353368 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.355152 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.355209 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.355234 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.359284 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.506627 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.508203 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.508264 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.508279 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.839074 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.841003 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.841057 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.841068 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:22 crc kubenswrapper[4585]: I1201 13:58:22.841103 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 13:58:22 crc kubenswrapper[4585]: E1201 13:58:22.846754 4585 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 01 13:58:23 crc kubenswrapper[4585]: I1201 13:58:23.587524 4585 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.144531 4585 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.286047 4585 apiserver.go:52] "Watching apiserver" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.289382 4585 reflector.go:368] 
Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.289858 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.290394 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.290430 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.290644 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.291003 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.291066 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:24 crc kubenswrapper[4585]: E1201 13:58:24.291087 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.291849 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:24 crc kubenswrapper[4585]: E1201 13:58:24.291993 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:24 crc kubenswrapper[4585]: E1201 13:58:24.291836 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.293605 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.293854 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.293882 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.294010 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.294152 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.294271 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.294575 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.294584 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.294930 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.320780 4585 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.325189 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.339564 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.354741 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.370028 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.383103 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.388840 4585 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.394864 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:24 crc kubenswrapper[4585]: I1201 13:58:24.408915 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.446854 4585 trace.go:236] Trace[911374639]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 13:58:11.006) (total time: 14440ms): Dec 01 13:58:25 crc kubenswrapper[4585]: Trace[911374639]: ---"Objects listed" error: 14439ms (13:58:25.446) Dec 01 13:58:25 crc kubenswrapper[4585]: Trace[911374639]: [14.440009059s] [14.440009059s] END Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.446897 4585 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.449036 4585 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.488266 4585 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48154->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.488588 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48154->192.168.126.11:17697: read: connection reset by peer" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.489130 4585 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.489203 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.518076 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 
13:58:25.520865 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da" exitCode=255 Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.521021 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da"} Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.536958 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550368 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550418 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550440 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550467 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550500 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550517 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550533 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550551 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550573 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550930 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550967 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550926 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.550948 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551028 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551101 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551150 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551191 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551203 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551212 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551273 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551298 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551320 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551340 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551360 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551378 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551395 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551416 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551437 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551454 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551473 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551491 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551508 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551524 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551540 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551557 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551575 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551593 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551611 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551631 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551649 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551666 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551682 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551698 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551712 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551730 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551749 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551763 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551793 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" 
(UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551807 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551821 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551836 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551221 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551893 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551285 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551396 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551917 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551966 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552011 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552028 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552045 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552066 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552081 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552100 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552118 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552142 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552167 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552183 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552201 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552217 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552232 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552249 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552265 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552281 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552297 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552314 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552329 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552344 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552359 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552376 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552391 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552408 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552423 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552437 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552453 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552470 4585 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552485 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552502 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552518 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552533 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552576 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552591 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552606 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552626 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552644 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552660 4585 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552676 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552694 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552728 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552744 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552759 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552776 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552792 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552807 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552822 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552838 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552857 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552875 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552890 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552905 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552923 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552938 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552955 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552990 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553006 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553022 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553037 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553053 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553068 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553083 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553100 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553115 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553136 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553157 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553174 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553193 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553212 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553227 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553244 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553260 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553277 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553293 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553309 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553326 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553341 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553357 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553372 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553389 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553406 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553421 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553437 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553452 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553468 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553485 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553502 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553520 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553537 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553569 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553587 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553605 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553621 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553638 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553654 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553670 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553687 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553706 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553722 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553737 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553753 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553770 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553785 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553801 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553818 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553833 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553851 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553867 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553885 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553901 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553918 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553935 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553952 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553967 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554109 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554133 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554156 4585 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554176 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554193 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554210 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554229 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554459 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554479 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554496 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554512 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554535 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554551 4585 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554567 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554584 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554603 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554619 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554635 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554650 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554672 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554692 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554708 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554725 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554742 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554759 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554776 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554792 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554808 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554825 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554841 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554857 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554872 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554889 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554931 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554951 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554988 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555012 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555056 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555077 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555096 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555113 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555135 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555165 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555183 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555202 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555222 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555243 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555305 4585 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555317 4585 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555327 4585 reconciler_common.go:293] "Volume detached 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555336 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555346 4585 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555359 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555368 4585 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555377 4585 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555387 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551570 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551869 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.551990 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552117 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552435 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552608 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.552847 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553075 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553205 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553485 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553482 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553470 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). 
InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553744 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553867 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.553991 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554294 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554647 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.554874 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.569411 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555178 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555315 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555542 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.555783 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.556029 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.556147 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.556332 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.556758 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.556965 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.557191 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.557394 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.557431 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.557545 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.558540 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.558601 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.558961 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.559021 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.559708 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.559856 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.560007 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.560140 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.560342 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.560547 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.560711 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.560749 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.560878 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.560955 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.561123 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.561556 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.561855 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.561863 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.561921 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.562226 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.562502 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.562541 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.563126 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.563872 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.563896 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.563998 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.564143 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.563963 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 13:58:26.063930685 +0000 UTC m=+20.048144550 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.564282 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.564799 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.565166 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.565497 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.565507 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.565534 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.565589 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.565838 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.566268 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.566556 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.566828 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.566826 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.567250 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.567732 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.568545 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.568954 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.569894 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.570305 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.570528 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.570832 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.571101 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.571536 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.572427 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.574385 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.574729 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.575856 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.576253 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.576438 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.577436 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.577461 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.577768 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.580794 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.584165 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.584586 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.585044 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.585510 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.585930 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.586083 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.586274 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.586309 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.586798 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.587915 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.588024 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.588421 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.588553 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.589021 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.589309 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.589807 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.590208 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.590449 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.590667 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.590744 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.590923 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.591123 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.591328 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.591372 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.591679 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.591756 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.592050 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.592088 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.592161 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.592741 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.593277 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.593353 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.596796 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.596883 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.597208 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.597208 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.597370 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.597468 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.597367 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.597718 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.597945 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.598079 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.598191 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.598361 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.598389 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.598407 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.598481 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:26.098455262 +0000 UTC m=+20.082669307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.598924 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.599226 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.599461 4585 scope.go:117] "RemoveContainer" containerID="e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.600268 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.600274 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.600384 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.600488 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.600672 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.601074 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.601218 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.601563 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.602533 4585 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.603002 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.603210 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.603300 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.603442 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:26.103407422 +0000 UTC m=+20.087621417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.603772 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.603959 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.604260 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.604309 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:26.104291055 +0000 UTC m=+20.088504910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.605399 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.605849 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.606245 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.577916 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.606363 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.606833 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.607310 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.607396 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.607406 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.607670 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.607707 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.608010 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.608193 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.608185 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.608202 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.608686 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.608803 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.608865 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.609377 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.609515 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.610434 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.611312 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.613006 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.614106 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.613081 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.613488 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.613549 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.613899 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.613998 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.614283 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.614663 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.581385 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.617064 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.611671 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.621777 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.621873 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.621893 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:25 crc kubenswrapper[4585]: E1201 13:58:25.622035 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:26.121951129 +0000 UTC m=+20.106165164 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.622411 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.622507 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.625307 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.625599 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.626169 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.627311 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.627665 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.640353 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.642514 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.645610 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.645676 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.650084 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.652624 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.655872 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.655921 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656036 4585 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656050 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656059 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656068 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656077 4585 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656086 4585 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656094 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656103 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656111 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656120 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656129 4585 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656138 4585 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656147 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656155 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656164 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656175 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656183 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656192 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656199 4585 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656208 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656217 4585 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656225 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656233 4585 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656242 4585 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656249 4585 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656257 4585 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656266 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656273 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656281 4585 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656290 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656298 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656308 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656316 4585 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656326 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656333 4585 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656342 4585 reconciler_common.go:293] "Volume detached for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656350 4585 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656360 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656368 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656375 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656383 4585 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656391 4585 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656398 4585 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656406 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656414 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656422 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656430 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656438 4585 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656445 4585 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656453 4585 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656461 4585 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656590 4585 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656602 4585 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656609 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656618 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656628 4585 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656637 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656645 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656674 4585 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656683 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656695 4585 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656704 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656712 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656720 4585 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656747 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656769 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656777 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656786 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656794 4585 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656802 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656831 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656840 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656849 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656857 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656865 4585 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656874 4585 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656905 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656914 4585 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656922 4585 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656931 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656938 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656947 4585 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656955 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.656965 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657176 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657184 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657193 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657202 4585 reconciler_common.go:293] "Volume detached for volume \"signing-key\" 
(UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657210 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657218 4585 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657225 4585 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657233 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657241 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657250 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657259 4585 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657266 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657275 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657283 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657291 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657321 4585 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657333 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657343 4585 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657355 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657367 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657379 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657389 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657399 4585 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657411 4585 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657423 4585 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657435 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657445 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657454 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657466 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657475 4585 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657485 4585 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657495 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657504 4585 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657517 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657525 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657533 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657542 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657550 4585 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657558 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657566 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657574 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657582 4585 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657589 4585 reconciler_common.go:293] "Volume detached for volume \"audit\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657597 4585 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657605 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657612 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657621 4585 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657629 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657636 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657644 4585 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657652 4585 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657660 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657667 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657676 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657687 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657698 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657709 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657719 4585 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657730 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657741 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657752 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657764 4585 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657775 4585 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657786 4585 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657798 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657811 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657822 4585 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657833 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657844 4585 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657855 4585 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657865 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657875 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657885 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657896 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657906 4585 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657916 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657926 4585 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657935 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657945 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657955 4585 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657966 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.657993 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" 
DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658004 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658016 4585 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658028 4585 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658044 4585 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658056 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658067 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658079 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658089 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658101 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658112 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658123 4585 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658136 4585 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658148 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658161 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658171 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658184 4585 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658195 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658205 4585 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658216 4585 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658276 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658276 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658429 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.658743 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.662190 4585 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.710546 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.812879 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.819829 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 13:58:25 crc kubenswrapper[4585]: I1201 13:58:25.828873 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 13:58:25 crc kubenswrapper[4585]: W1201 13:58:25.862772 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ab3d844008fdcfa461504e5eb5de75d3ab6757cb25f1d32c338acf32d7e60452 WatchSource:0}: Error finding container ab3d844008fdcfa461504e5eb5de75d3ab6757cb25f1d32c338acf32d7e60452: Status 404 returned error can't find the container with id ab3d844008fdcfa461504e5eb5de75d3ab6757cb25f1d32c338acf32d7e60452 Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.164676 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.164778 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.164812 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.164843 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.164886 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.164991 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.165065 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.165115 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-01 13:58:27.165085442 +0000 UTC m=+21.149299347 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.165118 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.165190 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.165204 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.165065 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.165282 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.165290 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.165141 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:27.165131803 +0000 UTC m=+21.149345738 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.165338 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 13:58:27.165315428 +0000 UTC m=+21.149529283 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.165350 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:27.165345179 +0000 UTC m=+21.149559034 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.165359 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:27.165355239 +0000 UTC m=+21.149569094 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.232146 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-62bsn"] Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.232564 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-62bsn" Dec 01 13:58:26 crc kubenswrapper[4585]: W1201 13:58:26.236276 4585 reflector.go:561] object-"openshift-dns"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.236366 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 13:58:26 crc kubenswrapper[4585]: W1201 13:58:26.236393 4585 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.236491 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 13:58:26 crc kubenswrapper[4585]: W1201 13:58:26.237567 4585 reflector.go:561] object-"openshift-dns"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.237628 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.316834 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-che
ck-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.359068 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.366942 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/98be7526-98f6-4a4a-b4a6-1d10e76b7a99-hosts-file\") pod \"node-resolver-62bsn\" (UID: \"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\") " pod="openshift-dns/node-resolver-62bsn" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.367048 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxjs6\" (UniqueName: \"kubernetes.io/projected/98be7526-98f6-4a4a-b4a6-1d10e76b7a99-kube-api-access-dxjs6\") pod \"node-resolver-62bsn\" (UID: \"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\") " pod="openshift-dns/node-resolver-62bsn" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.402224 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.411810 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.411815 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.412280 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.412542 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.411826 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:26 crc kubenswrapper[4585]: E1201 13:58:26.412649 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.417103 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.418013 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.419482 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.420172 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.421282 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.421877 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.422638 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.423708 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.424399 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 01 13:58:26 
crc kubenswrapper[4585]: I1201 13:58:26.426448 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.427541 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.429658 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.429931 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.431137 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.432087 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.434377 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.435414 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.436997 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.437634 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.438705 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.440286 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.441187 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 
13:58:26.442652 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.446735 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.447671 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.449792 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.453736 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.455243 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.455808 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.456208 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.457570 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.459013 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.460633 4585 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.461040 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.463359 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.464598 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.465193 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.467716 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.467714 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/98be7526-98f6-4a4a-b4a6-1d10e76b7a99-hosts-file\") pod \"node-resolver-62bsn\" (UID: \"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\") " pod="openshift-dns/node-resolver-62bsn" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.468072 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxjs6\" (UniqueName: \"kubernetes.io/projected/98be7526-98f6-4a4a-b4a6-1d10e76b7a99-kube-api-access-dxjs6\") pod 
\"node-resolver-62bsn\" (UID: \"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\") " pod="openshift-dns/node-resolver-62bsn" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.467785 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/98be7526-98f6-4a4a-b4a6-1d10e76b7a99-hosts-file\") pod \"node-resolver-62bsn\" (UID: \"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\") " pod="openshift-dns/node-resolver-62bsn" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.469392 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.470049 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.471659 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.472478 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.473502 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.474376 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.475950 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.476384 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.477079 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.478077 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.478682 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.480667 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.481767 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.483324 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.484379 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.485372 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.486697 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.487507 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.488622 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.490385 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.504157 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc 
kubenswrapper[4585]: I1201 13:58:26.518655 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.525664 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.529037 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63"} Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.533894 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.534369 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"81b43aced4b2f90d0a2086832711d8df01b0b83b645ae5d070da9240b1842755"} Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.536366 4585 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830"} Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.536390 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506"} Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.536400 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ab3d844008fdcfa461504e5eb5de75d3ab6757cb25f1d32c338acf32d7e60452"} Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.538635 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7"} Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.538668 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b9c9baaee4ccd9d44fa0944c912d0f13cd3748e27fcb3d279812bc08f5bce48c"} Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.564377 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.586094 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01
T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.602749 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.627326 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.638549 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.651413 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.660079 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc 
kubenswrapper[4585]: I1201 13:58:26.673523 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.684504 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.702640 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.864522 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.889938 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.917034 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.959183 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 13:58:26 crc kubenswrapper[4585]: I1201 13:58:26.979803 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.172556 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.172786 4585 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 13:58:29.172748244 +0000 UTC m=+23.156962099 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.173196 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.173351 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.173449 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.173578 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.173417 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.173811 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.173884 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.173521 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.173673 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.174046 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.174063 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.173709 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.174370 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:29.174347926 +0000 UTC m=+23.158561781 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.174455 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:29.174446328 +0000 UTC m=+23.158660173 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.174534 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:29.17452339 +0000 UTC m=+23.158737245 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.174612 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:29.174600852 +0000 UTC m=+23.158814707 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.211366 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xh4hc"] Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.212437 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.215418 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-9wjs5"] Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.215896 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.216930 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.216960 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.216996 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.217256 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.217702 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.219378 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-lj9gs"] Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.219594 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.219711 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.226844 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.237432 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.237709 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.239065 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.241641 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.242257 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.248875 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274217 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274262 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-cni-binary-copy\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " 
pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274301 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-multus-cni-dir\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274320 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-multus-socket-dir-parent\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274337 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-run-netns\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274353 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f7beb40d-bcd0-43c8-a9fe-c32408790a4c-rootfs\") pod \"machine-config-daemon-lj9gs\" (UID: \"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\") " pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274380 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-system-cni-dir\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274394 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-var-lib-cni-bin\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274417 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-hostroot\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274433 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvc7j\" (UniqueName: \"kubernetes.io/projected/6e7ad3ad-7937-409b-b1c9-9c801f937400-kube-api-access-xvc7j\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274448 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-cnibin\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " 
pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274462 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zccd\" (UniqueName: \"kubernetes.io/projected/f7beb40d-bcd0-43c8-a9fe-c32408790a4c-kube-api-access-2zccd\") pod \"machine-config-daemon-lj9gs\" (UID: \"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\") " pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274477 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e7ad3ad-7937-409b-b1c9-9c801f937400-cni-binary-copy\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274491 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-etc-kubernetes\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274507 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274523 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-cnibin\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274538 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-var-lib-kubelet\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274555 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgw4r\" (UniqueName: \"kubernetes.io/projected/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-kube-api-access-dgw4r\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274581 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-run-k8s-cni-cncf-io\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274596 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-run-multus-certs\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274613 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-system-cni-dir\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274627 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-os-release\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274645 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-multus-conf-dir\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274659 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e7ad3ad-7937-409b-b1c9-9c801f937400-multus-daemon-config\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274678 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7beb40d-bcd0-43c8-a9fe-c32408790a4c-proxy-tls\") pod \"machine-config-daemon-lj9gs\" (UID: \"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\") " pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274693 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-os-release\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274709 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-var-lib-cni-multus\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.274726 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7beb40d-bcd0-43c8-a9fe-c32408790a4c-mcd-auth-proxy-config\") pod \"machine-config-daemon-lj9gs\" (UID: \"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\") " pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.292999 4585 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.344309 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.369091 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375559 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-os-release\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375606 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-var-lib-cni-multus\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375627 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7beb40d-bcd0-43c8-a9fe-c32408790a4c-mcd-auth-proxy-config\") pod \"machine-config-daemon-lj9gs\" (UID: \"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\") " pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375647 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375667 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-cni-binary-copy\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375700 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-multus-cni-dir\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375717 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-multus-socket-dir-parent\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375733 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-run-netns\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375750 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f7beb40d-bcd0-43c8-a9fe-c32408790a4c-rootfs\") pod \"machine-config-daemon-lj9gs\" (UID: \"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\") " pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375775 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-system-cni-dir\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375795 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-var-lib-cni-bin\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375816 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-hostroot\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375833 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvc7j\" (UniqueName: 
\"kubernetes.io/projected/6e7ad3ad-7937-409b-b1c9-9c801f937400-kube-api-access-xvc7j\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375853 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-cnibin\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375873 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zccd\" (UniqueName: \"kubernetes.io/projected/f7beb40d-bcd0-43c8-a9fe-c32408790a4c-kube-api-access-2zccd\") pod \"machine-config-daemon-lj9gs\" (UID: \"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\") " pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375888 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e7ad3ad-7937-409b-b1c9-9c801f937400-cni-binary-copy\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375902 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-etc-kubernetes\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375917 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375932 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-cnibin\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375947 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-var-lib-kubelet\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.375996 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgw4r\" (UniqueName: \"kubernetes.io/projected/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-kube-api-access-dgw4r\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.376010 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-system-cni-dir\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.376025 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-os-release\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.376046 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-run-k8s-cni-cncf-io\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.376032 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-os-release\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.376060 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-run-multus-certs\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.376133 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-run-multus-certs\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.376143 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-multus-conf-dir\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.376163 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e7ad3ad-7937-409b-b1c9-9c801f937400-multus-daemon-config\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.376184 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-var-lib-cni-multus\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.376187 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7beb40d-bcd0-43c8-a9fe-c32408790a4c-proxy-tls\") pod \"machine-config-daemon-lj9gs\" (UID: 
\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\") " pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.376801 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-cnibin\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.376990 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377080 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-multus-conf-dir\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377092 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7beb40d-bcd0-43c8-a9fe-c32408790a4c-mcd-auth-proxy-config\") pod \"machine-config-daemon-lj9gs\" (UID: \"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\") " pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377141 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f7beb40d-bcd0-43c8-a9fe-c32408790a4c-rootfs\") pod \"machine-config-daemon-lj9gs\" (UID: \"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\") " pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377192 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-multus-cni-dir\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377228 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-multus-socket-dir-parent\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377252 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-run-netns\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377274 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-var-lib-kubelet\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc 
kubenswrapper[4585]: I1201 13:58:27.377429 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-cni-binary-copy\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377487 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-system-cni-dir\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377514 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-var-lib-cni-bin\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377539 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-hostroot\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377715 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e7ad3ad-7937-409b-b1c9-9c801f937400-multus-daemon-config\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377771 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-cnibin\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377797 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-etc-kubernetes\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377820 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-system-cni-dir\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377873 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e7ad3ad-7937-409b-b1c9-9c801f937400-cni-binary-copy\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377924 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e7ad3ad-7937-409b-b1c9-9c801f937400-host-run-k8s-cni-cncf-io\") pod 
\"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.377991 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-os-release\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.378169 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.383840 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7beb40d-bcd0-43c8-a9fe-c32408790a4c-proxy-tls\") pod \"machine-config-daemon-lj9gs\" (UID: \"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\") " pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.418837 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvc7j\" (UniqueName: \"kubernetes.io/projected/6e7ad3ad-7937-409b-b1c9-9c801f937400-kube-api-access-xvc7j\") pod \"multus-9wjs5\" (UID: \"6e7ad3ad-7937-409b-b1c9-9c801f937400\") " pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.431652 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgw4r\" (UniqueName: \"kubernetes.io/projected/e0e1eaa6-bee9-401a-b01a-9bf49a938b29-kube-api-access-dgw4r\") pod \"multus-additional-cni-plugins-xh4hc\" (UID: \"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\") " pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.441157 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.474377 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.487022 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zccd\" (UniqueName: \"kubernetes.io/projected/f7beb40d-bcd0-43c8-a9fe-c32408790a4c-kube-api-access-2zccd\") pod \"machine-config-daemon-lj9gs\" (UID: \"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\") " pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.490634 4585 projected.go:288] Couldn't get configMap openshift-dns/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.490707 4585 projected.go:194] Error preparing data for projected volume kube-api-access-dxjs6 for pod openshift-dns/node-resolver-62bsn: failed to sync configmap cache: timed out waiting for the condition Dec 01 13:58:27 crc kubenswrapper[4585]: E1201 13:58:27.490799 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98be7526-98f6-4a4a-b4a6-1d10e76b7a99-kube-api-access-dxjs6 podName:98be7526-98f6-4a4a-b4a6-1d10e76b7a99 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:27.990769555 +0000 UTC m=+21.974983410 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dxjs6" (UniqueName: "kubernetes.io/projected/98be7526-98f6-4a4a-b4a6-1d10e76b7a99-kube-api-access-dxjs6") pod "node-resolver-62bsn" (UID: "98be7526-98f6-4a4a-b4a6-1d10e76b7a99") : failed to sync configmap cache: timed out waiting for the condition Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.498094 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.525533 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.530788 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.531122 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9wjs5" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.537836 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.560393 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.568429 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.574566 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.605731 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.634055 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.640944 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tjkqr"] Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.642401 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.647148 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.647267 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.647310 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.647368 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.647386 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.647442 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.647800 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.655564 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.676704 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.692928 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.714444 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.741729 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.771601 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781039 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781075 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-ovn\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781133 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-node-log\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781175 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-etc-openvswitch\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781197 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-var-lib-openvswitch\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781221 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovnkube-config\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781242 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-cni-bin\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781261 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-openvswitch\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781278 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-systemd-units\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781298 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-run-netns\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781318 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-systemd\") pod 
\"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781346 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-slash\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781366 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-env-overrides\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781385 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovn-node-metrics-cert\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781414 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-cni-netd\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781433 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-log-socket\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781453 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjhfw\" (UniqueName: \"kubernetes.io/projected/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-kube-api-access-xjhfw\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781481 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovnkube-script-lib\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781499 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-kubelet\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.781519 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-run-ovn-kubernetes\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.798083 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc 
kubenswrapper[4585]: I1201 13:58:27.838419 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.855667 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.878257 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.881935 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-etc-openvswitch\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.881988 
4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-var-lib-openvswitch\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882006 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovnkube-config\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882023 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-openvswitch\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882039 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-cni-bin\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882054 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-systemd-units\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882069 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-slash\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882088 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-run-netns\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882102 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-systemd\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882124 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-env-overrides\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882141 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovn-node-metrics-cert\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882164 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-log-socket\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882179 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-cni-netd\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882196 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjhfw\" (UniqueName: \"kubernetes.io/projected/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-kube-api-access-xjhfw\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882231 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovnkube-script-lib\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882252 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-kubelet\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882268 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-run-ovn-kubernetes\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882297 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-node-log\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882313 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882331 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-ovn\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882397 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-ovn\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882489 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-etc-openvswitch\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.882515 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-var-lib-openvswitch\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883089 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-slash\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883097 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-cni-netd\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883129 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-systemd-units\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883139 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-log-socket\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883135 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-node-log\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883168 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-kubelet\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 
13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883220 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-systemd\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883236 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-cni-bin\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883251 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-run-netns\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883262 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-openvswitch\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883323 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883397 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-run-ovn-kubernetes\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883895 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovnkube-script-lib\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.883918 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-env-overrides\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.884328 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovnkube-config\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.893174 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovn-node-metrics-cert\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.904333 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z 
is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.910572 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjhfw\" (UniqueName: \"kubernetes.io/projected/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-kube-api-access-xjhfw\") pod \"ovnkube-node-tjkqr\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.924562 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.946263 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.964391 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.969280 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:27 crc kubenswrapper[4585]: W1201 13:58:27.979820 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0b45150_070d_4f7c_b53a_d76dcbaa6e6d.slice/crio-8009062339cfd4914b028580ac141efa8f7917ae1acd3198f12de89af62f34b4 WatchSource:0}: Error finding container 8009062339cfd4914b028580ac141efa8f7917ae1acd3198f12de89af62f34b4: Status 404 returned error can't find 
the container with id 8009062339cfd4914b028580ac141efa8f7917ae1acd3198f12de89af62f34b4 Dec 01 13:58:27 crc kubenswrapper[4585]: I1201 13:58:27.989115 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:27Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.012896 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.043790 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.072900 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.083601 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxjs6\" (UniqueName: \"kubernetes.io/projected/98be7526-98f6-4a4a-b4a6-1d10e76b7a99-kube-api-access-dxjs6\") pod \"node-resolver-62bsn\" (UID: \"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\") " pod="openshift-dns/node-resolver-62bsn" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.093932 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxjs6\" (UniqueName: \"kubernetes.io/projected/98be7526-98f6-4a4a-b4a6-1d10e76b7a99-kube-api-access-dxjs6\") pod \"node-resolver-62bsn\" (UID: \"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\") " pod="openshift-dns/node-resolver-62bsn" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.097774 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.125149 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.144553 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.163262 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.346542 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-62bsn" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.413994 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:28 crc kubenswrapper[4585]: E1201 13:58:28.414151 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.414614 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:28 crc kubenswrapper[4585]: E1201 13:58:28.414689 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.414742 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:28 crc kubenswrapper[4585]: E1201 13:58:28.414795 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.554855 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9wjs5" event={"ID":"6e7ad3ad-7937-409b-b1c9-9c801f937400","Type":"ContainerStarted","Data":"9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073"} Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.554910 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9wjs5" event={"ID":"6e7ad3ad-7937-409b-b1c9-9c801f937400","Type":"ContainerStarted","Data":"575d686b7f1fc91ee8954cb3271fd99f1623c758126869f7321b02a28ea96319"} Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.566308 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8"} Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.576494 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58" exitCode=0 Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.576578 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58"} Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.576620 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"8009062339cfd4914b028580ac141efa8f7917ae1acd3198f12de89af62f34b4"} Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.579388 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-62bsn" 
event={"ID":"98be7526-98f6-4a4a-b4a6-1d10e76b7a99","Type":"ContainerStarted","Data":"9f1496f1578ddd292fbb92d4ff3d99d2daf11459ddaadefec85c89810bd9054d"} Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.582901 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.585200 4585 generic.go:334] "Generic (PLEG): container finished" podID="e0e1eaa6-bee9-401a-b01a-9bf49a938b29" containerID="4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b" exitCode=0 Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.585292 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" event={"ID":"e0e1eaa6-bee9-401a-b01a-9bf49a938b29","Type":"ContainerDied","Data":"4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b"} Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.585352 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" 
event={"ID":"e0e1eaa6-bee9-401a-b01a-9bf49a938b29","Type":"ContainerStarted","Data":"379bb05d8c7244a0935ee57abceeb68320d35fabf6a31e1c0667e8ac586bed10"} Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.595123 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003"} Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.595159 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00"} Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.595175 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"f7ae0b30c1ca32895a76b17cd02af221a4d5caf86ee0d2025ea6aef693c524b3"} Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.604135 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.638284 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.652651 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.662382 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.687384 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: 
I1201 13:58:28.717582 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.738223 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.752254 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 
2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.766062 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.784608 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.799287 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.808515 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.813910 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 13:58:28 crc kubenswrapper[4585]: 
I1201 13:58:28.815445 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.818685 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.832265 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.846534 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.863541 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.874895 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.901800 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.919747 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.941000 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.961855 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:28 crc kubenswrapper[4585]: I1201 13:58:28.982775 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:28Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc 
kubenswrapper[4585]: I1201 13:58:29.006918 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.031398 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.077385 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.112269 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.155091 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.164411 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.194761 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.199608 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.199771 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.199812 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.199832 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.199853 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.199993 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.200056 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:33.2000371 +0000 UTC m=+27.184250955 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.200064 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.200259 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.200290 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.200313 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.200372 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:33.200351379 +0000 UTC m=+27.184565284 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.200457 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 13:58:33.200445771 +0000 UTC m=+27.184659716 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.200464 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.200481 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.200493 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.200499 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.200520 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:33.200513213 +0000 UTC m=+27.184727068 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.200535 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:33.200526833 +0000 UTC m=+27.184740768 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.207521 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.238323 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc 
kubenswrapper[4585]: I1201 13:58:29.247257 4585 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.253207 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.253255 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.253268 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.253405 4585 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.257115 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-nhp6c"] Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.257602 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nhp6c" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.262193 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.262236 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.262471 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.262489 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.270548 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.275111 4585 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.275419 4585 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.278239 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.278265 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.278275 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.278293 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.278305 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:29Z","lastTransitionTime":"2025-12-01T13:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.291209 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.300498 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n25g8\" (UniqueName: 
\"kubernetes.io/projected/da6d315c-d478-4216-9f3d-57b20ce5ced8-kube-api-access-n25g8\") pod \"node-ca-nhp6c\" (UID: \"da6d315c-d478-4216-9f3d-57b20ce5ced8\") " pod="openshift-image-registry/node-ca-nhp6c" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.300571 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da6d315c-d478-4216-9f3d-57b20ce5ced8-serviceca\") pod \"node-ca-nhp6c\" (UID: \"da6d315c-d478-4216-9f3d-57b20ce5ced8\") " pod="openshift-image-registry/node-ca-nhp6c" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.300597 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da6d315c-d478-4216-9f3d-57b20ce5ced8-host\") pod \"node-ca-nhp6c\" (UID: \"da6d315c-d478-4216-9f3d-57b20ce5ced8\") " pod="openshift-image-registry/node-ca-nhp6c" Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.306515 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.321291 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.321339 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.321348 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.321367 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.321381 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:29Z","lastTransitionTime":"2025-12-01T13:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.336490 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.340945 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.345305 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.345349 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.345365 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.345385 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.345399 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:29Z","lastTransitionTime":"2025-12-01T13:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.356799 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: 
E1201 13:58:29.364751 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\
\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753f
c478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef0
5305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.368248 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.368301 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.368312 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.368333 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.368344 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:29Z","lastTransitionTime":"2025-12-01T13:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.376002 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.393714 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.394504 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.401075 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n25g8\" (UniqueName: \"kubernetes.io/projected/da6d315c-d478-4216-9f3d-57b20ce5ced8-kube-api-access-n25g8\") pod \"node-ca-nhp6c\" (UID: \"da6d315c-d478-4216-9f3d-57b20ce5ced8\") " pod="openshift-image-registry/node-ca-nhp6c" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.401140 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da6d315c-d478-4216-9f3d-57b20ce5ced8-serviceca\") pod \"node-ca-nhp6c\" (UID: \"da6d315c-d478-4216-9f3d-57b20ce5ced8\") " pod="openshift-image-registry/node-ca-nhp6c" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.401162 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da6d315c-d478-4216-9f3d-57b20ce5ced8-host\") pod \"node-ca-nhp6c\" (UID: \"da6d315c-d478-4216-9f3d-57b20ce5ced8\") " pod="openshift-image-registry/node-ca-nhp6c" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.401228 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da6d315c-d478-4216-9f3d-57b20ce5ced8-host\") pod \"node-ca-nhp6c\" (UID: \"da6d315c-d478-4216-9f3d-57b20ce5ced8\") " pod="openshift-image-registry/node-ca-nhp6c" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.402108 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.402123 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da6d315c-d478-4216-9f3d-57b20ce5ced8-serviceca\") pod \"node-ca-nhp6c\" (UID: \"da6d315c-d478-4216-9f3d-57b20ce5ced8\") " pod="openshift-image-registry/node-ca-nhp6c" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.402137 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.402151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.402168 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.402178 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:29Z","lastTransitionTime":"2025-12-01T13:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.417617 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.422065 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.422557 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.427496 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.427548 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.427557 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.427574 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.427586 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:29Z","lastTransitionTime":"2025-12-01T13:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.431339 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n25g8\" (UniqueName: \"kubernetes.io/projected/da6d315c-d478-4216-9f3d-57b20ce5ced8-kube-api-access-n25g8\") pod \"node-ca-nhp6c\" (UID: \"da6d315c-d478-4216-9f3d-57b20ce5ced8\") " pod="openshift-image-registry/node-ca-nhp6c" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.454624 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z 
is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.470481 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.487677 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.518677 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.530478 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.530527 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.530540 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.530560 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.530573 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:29Z","lastTransitionTime":"2025-12-01T13:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.544351 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z 
is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.557608 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.574105 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-nhp6c" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.575285 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: W1201 13:58:29.588604 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda6d315c_d478_4216_9f3d_57b20ce5ced8.slice/crio-44746780f8f947e8d79cf2a6c83bc3eef3c238eaf32ca46006c48516377fdea6 WatchSource:0}: Error finding container 44746780f8f947e8d79cf2a6c83bc3eef3c238eaf32ca46006c48516377fdea6: Status 404 returned error can't find the container with id 44746780f8f947e8d79cf2a6c83bc3eef3c238eaf32ca46006c48516377fdea6 Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.592501 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.600062 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nhp6c" event={"ID":"da6d315c-d478-4216-9f3d-57b20ce5ced8","Type":"ContainerStarted","Data":"44746780f8f947e8d79cf2a6c83bc3eef3c238eaf32ca46006c48516377fdea6"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.603010 4585 generic.go:334] "Generic (PLEG): container finished" podID="e0e1eaa6-bee9-401a-b01a-9bf49a938b29" containerID="9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a" exitCode=0 Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.603083 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" event={"ID":"e0e1eaa6-bee9-401a-b01a-9bf49a938b29","Type":"ContainerDied","Data":"9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.604788 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-62bsn" event={"ID":"98be7526-98f6-4a4a-b4a6-1d10e76b7a99","Type":"ContainerStarted","Data":"f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.608166 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" 
event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.608202 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.608217 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.608229 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.610873 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: E1201 13:58:29.617161 4585 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.627034 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\
\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.633683 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.633735 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.633749 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.633771 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.633790 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:29Z","lastTransitionTime":"2025-12-01T13:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.653340 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.671532 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.690115 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.713472 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc 
kubenswrapper[4585]: I1201 13:58:29.738652 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.738782 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.738814 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.738849 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.738869 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.738880 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:29Z","lastTransitionTime":"2025-12-01T13:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.757608 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.771089 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.784270 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.804341 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.821266 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.841588 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.843160 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.843202 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.843216 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.843258 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.843271 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:29Z","lastTransitionTime":"2025-12-01T13:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.859407 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.873159 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.886634 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.903154 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.919248 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 
2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.935221 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.945597 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.945644 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.945656 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.945674 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.945685 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:29Z","lastTransitionTime":"2025-12-01T13:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.953523 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.972827 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:29 crc kubenswrapper[4585]: I1201 13:58:29.998324 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCo
de\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:29Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.015277 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.049631 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.049677 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.049687 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.049707 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.049718 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:30Z","lastTransitionTime":"2025-12-01T13:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.151859 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.151891 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.151900 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.151916 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.151926 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:30Z","lastTransitionTime":"2025-12-01T13:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.254734 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.254785 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.254794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.254813 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.254824 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:30Z","lastTransitionTime":"2025-12-01T13:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.357323 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.357391 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.357409 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.357433 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.357446 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:30Z","lastTransitionTime":"2025-12-01T13:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.412661 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.412719 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.412690 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:30 crc kubenswrapper[4585]: E1201 13:58:30.412874 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:30 crc kubenswrapper[4585]: E1201 13:58:30.413079 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:30 crc kubenswrapper[4585]: E1201 13:58:30.413297 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.461078 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.461143 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.461157 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.461181 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.461208 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:30Z","lastTransitionTime":"2025-12-01T13:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.564016 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.564069 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.564084 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.564113 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.564129 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:30Z","lastTransitionTime":"2025-12-01T13:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.612304 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nhp6c" event={"ID":"da6d315c-d478-4216-9f3d-57b20ce5ced8","Type":"ContainerStarted","Data":"3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.615313 4585 generic.go:334] "Generic (PLEG): container finished" podID="e0e1eaa6-bee9-401a-b01a-9bf49a938b29" containerID="6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a" exitCode=0 Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.615417 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" event={"ID":"e0e1eaa6-bee9-401a-b01a-9bf49a938b29","Type":"ContainerDied","Data":"6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.623410 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.623480 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.632787 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.652771 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.667840 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.667893 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.667904 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.667921 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.667934 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:30Z","lastTransitionTime":"2025-12-01T13:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.669113 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.684023 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.699298 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.727444 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCo
de\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.744703 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.761557 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.771645 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.771801 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.771877 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.771949 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.772273 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:30Z","lastTransitionTime":"2025-12-01T13:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.776950 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.791347 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.813097 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.875202 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.875244 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.875256 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.875272 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.875282 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:30Z","lastTransitionTime":"2025-12-01T13:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.889246 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.909319 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.919890 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.936771 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.950479 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.962331 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.971719 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.976937 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.977120 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.977207 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.977281 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.977344 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:30Z","lastTransitionTime":"2025-12-01T13:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:30 crc kubenswrapper[4585]: I1201 13:58:30.993865 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:30Z 
is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.004173 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.016917 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.031785 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.048144 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.067803 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.080413 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.080466 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.080478 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.080499 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.080511 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:31Z","lastTransitionTime":"2025-12-01T13:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.091226 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.106779 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.124225 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.138532 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.152221 4585 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.165012 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.183285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.183352 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.183364 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.183384 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.183397 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:31Z","lastTransitionTime":"2025-12-01T13:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.286578 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.286624 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.286633 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.286649 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.286661 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:31Z","lastTransitionTime":"2025-12-01T13:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.389742 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.389785 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.389794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.389811 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.389821 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:31Z","lastTransitionTime":"2025-12-01T13:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.492720 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.492764 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.492774 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.492792 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.492816 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:31Z","lastTransitionTime":"2025-12-01T13:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.595503 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.595532 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.595542 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.595558 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.595568 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:31Z","lastTransitionTime":"2025-12-01T13:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.629634 4585 generic.go:334] "Generic (PLEG): container finished" podID="e0e1eaa6-bee9-401a-b01a-9bf49a938b29" containerID="2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758" exitCode=0 Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.629699 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" event={"ID":"e0e1eaa6-bee9-401a-b01a-9bf49a938b29","Type":"ContainerDied","Data":"2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758"} Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.646038 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.658934 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.679160 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.697926 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.699007 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.699111 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.699191 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.699262 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.699319 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:31Z","lastTransitionTime":"2025-12-01T13:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.715191 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.738334 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.749899 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.763585 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.777722 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.799403 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.802597 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.802642 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.802658 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.802678 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.802692 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:31Z","lastTransitionTime":"2025-12-01T13:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.814287 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.828727 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.841088 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.862531 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.881236 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.905444 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.905486 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.905496 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.905511 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:31 crc kubenswrapper[4585]: I1201 13:58:31.905523 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:31Z","lastTransitionTime":"2025-12-01T13:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.007886 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.008271 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.008281 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.008298 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.008308 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:32Z","lastTransitionTime":"2025-12-01T13:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.110384 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.110427 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.110436 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.110452 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.110460 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:32Z","lastTransitionTime":"2025-12-01T13:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.213104 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.213155 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.213166 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.213189 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.213200 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:32Z","lastTransitionTime":"2025-12-01T13:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.316807 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.316853 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.316862 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.316880 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.316893 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:32Z","lastTransitionTime":"2025-12-01T13:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.412319 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.412357 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:32 crc kubenswrapper[4585]: E1201 13:58:32.412502 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.412905 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:32 crc kubenswrapper[4585]: E1201 13:58:32.412988 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:32 crc kubenswrapper[4585]: E1201 13:58:32.413037 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.419433 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.419498 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.419512 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.419592 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.419604 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:32Z","lastTransitionTime":"2025-12-01T13:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.522256 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.522309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.522318 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.522334 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.522346 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:32Z","lastTransitionTime":"2025-12-01T13:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.625621 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.625672 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.625684 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.625702 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.625714 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:32Z","lastTransitionTime":"2025-12-01T13:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.638025 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a"} Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.643737 4585 generic.go:334] "Generic (PLEG): container finished" podID="e0e1eaa6-bee9-401a-b01a-9bf49a938b29" containerID="e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b" exitCode=0 Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.643787 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" event={"ID":"e0e1eaa6-bee9-401a-b01a-9bf49a938b29","Type":"ContainerDied","Data":"e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b"} Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.661216 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.676374 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.693428 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.713322 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.726923 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.732515 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.732548 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.732561 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.732579 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.732589 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:32Z","lastTransitionTime":"2025-12-01T13:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.744572 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.758453 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.772749 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.785723 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\
\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.799566 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.812579 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.824927 4585 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.836374 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.836417 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.836428 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.836446 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.836458 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:32Z","lastTransitionTime":"2025-12-01T13:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.839652 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw
4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.863414 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.877418 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:32Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.938955 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.939013 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.939023 4585 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.939040 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:32 crc kubenswrapper[4585]: I1201 13:58:32.939051 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:32Z","lastTransitionTime":"2025-12-01T13:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.041307 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.041350 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.041361 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.041379 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.041391 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:33Z","lastTransitionTime":"2025-12-01T13:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.143580 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.143627 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.143638 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.143661 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.143674 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:33Z","lastTransitionTime":"2025-12-01T13:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.243571 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.243714 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 13:58:33.243745 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 13:58:41.243720499 +0000 UTC m=+35.227934344 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.243771 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.243798 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.243822 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 13:58:33.243864 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 13:58:33.243885 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 
13:58:33.243887 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 13:58:33.243899 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 13:58:33.243931 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:41.243921934 +0000 UTC m=+35.228135789 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 13:58:33.243949 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:41.243938674 +0000 UTC m=+35.228152529 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 13:58:33.243985 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 13:58:33.244030 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:41.244020727 +0000 UTC m=+35.228234582 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 13:58:33.244055 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 13:58:33.244076 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 13:58:33.244088 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:33 crc kubenswrapper[4585]: E1201 13:58:33.244124 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:41.244113359 +0000 UTC m=+35.228327214 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.245791 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.245820 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.245829 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.245843 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.245851 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:33Z","lastTransitionTime":"2025-12-01T13:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.349088 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.349128 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.349137 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.349152 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.349161 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:33Z","lastTransitionTime":"2025-12-01T13:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.452538 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.452588 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.452601 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.452706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.452724 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:33Z","lastTransitionTime":"2025-12-01T13:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.555140 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.555215 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.555228 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.555252 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.555265 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:33Z","lastTransitionTime":"2025-12-01T13:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.651178 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" event={"ID":"e0e1eaa6-bee9-401a-b01a-9bf49a938b29","Type":"ContainerStarted","Data":"3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c"} Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.658435 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.658481 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.658495 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.658510 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.658536 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:33Z","lastTransitionTime":"2025-12-01T13:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.668966 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.680856 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.699004 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.711635 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.726214 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.742006 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.754416 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.760399 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.760608 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.760710 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.760777 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.760833 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:33Z","lastTransitionTime":"2025-12-01T13:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.770336 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.783072 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.796450 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.808846 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.820024 4585 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.837269 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b
888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.861276 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.863251 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.863299 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.863309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.863329 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.863341 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:33Z","lastTransitionTime":"2025-12-01T13:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.881173 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:33Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.966358 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.966824 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.966937 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.967029 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:33 crc kubenswrapper[4585]: I1201 13:58:33.967090 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:33Z","lastTransitionTime":"2025-12-01T13:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.070263 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.070639 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.070649 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.070666 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.070718 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:34Z","lastTransitionTime":"2025-12-01T13:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.173095 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.173133 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.173143 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.173161 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.173175 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:34Z","lastTransitionTime":"2025-12-01T13:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.275436 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.275472 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.275481 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.275495 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.275504 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:34Z","lastTransitionTime":"2025-12-01T13:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.378473 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.378523 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.378566 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.378584 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.378595 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:34Z","lastTransitionTime":"2025-12-01T13:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.411922 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.412920 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:34 crc kubenswrapper[4585]: E1201 13:58:34.413091 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.413198 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:34 crc kubenswrapper[4585]: E1201 13:58:34.413289 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:34 crc kubenswrapper[4585]: E1201 13:58:34.413393 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.484018 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.484091 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.484105 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.484128 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.484668 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:34Z","lastTransitionTime":"2025-12-01T13:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.588317 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.588366 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.588380 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.588399 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.588411 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:34Z","lastTransitionTime":"2025-12-01T13:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.666581 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848"} Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.670403 4585 generic.go:334] "Generic (PLEG): container finished" podID="e0e1eaa6-bee9-401a-b01a-9bf49a938b29" containerID="3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c" exitCode=0 Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.670457 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" event={"ID":"e0e1eaa6-bee9-401a-b01a-9bf49a938b29","Type":"ContainerDied","Data":"3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c"} Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.695181 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.695230 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.695246 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.695271 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.695290 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:34Z","lastTransitionTime":"2025-12-01T13:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.702656 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.719532 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.734503 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.765410 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.787137 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.798082 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.798116 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.798126 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.798142 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.798154 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:34Z","lastTransitionTime":"2025-12-01T13:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.803854 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.819111 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.838060 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.856918 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.876081 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.887825 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.900371 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.900405 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.900415 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.900434 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.900446 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:34Z","lastTransitionTime":"2025-12-01T13:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.902338 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.918723 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.937129 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.949248 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.969294 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.982788 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:34 crc kubenswrapper[4585]: I1201 13:58:34.996326 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:34Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.003295 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.003338 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.003349 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.003369 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.003381 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:35Z","lastTransitionTime":"2025-12-01T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.008375 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.020098 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.035321 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.047620 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.058318 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.071485 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha2
56:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.087374 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.099989 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.106111 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.106156 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.106164 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.106212 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.106223 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:35Z","lastTransitionTime":"2025-12-01T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.121045 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2035d90bb6ffbd065eccd56b401e72fe58499ff5
b323f28d73cbf6082ff23848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.132665 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.145840 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.162345 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.209523 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.209937 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.209947 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.209965 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.209997 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:35Z","lastTransitionTime":"2025-12-01T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.312031 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.312080 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.312092 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.312110 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.312122 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:35Z","lastTransitionTime":"2025-12-01T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.318223 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.414509 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.414555 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.414565 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.414584 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.414598 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:35Z","lastTransitionTime":"2025-12-01T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.517053 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.517094 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.517107 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.517125 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.517135 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:35Z","lastTransitionTime":"2025-12-01T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.620084 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.620140 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.620150 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.620169 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.620183 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:35Z","lastTransitionTime":"2025-12-01T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.681161 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" event={"ID":"e0e1eaa6-bee9-401a-b01a-9bf49a938b29","Type":"ContainerStarted","Data":"db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62"} Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.681209 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.682086 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.696951 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.708748 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.715574 4585 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.720823 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.722634 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.722687 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.722703 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.722730 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.722748 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:35Z","lastTransitionTime":"2025-12-01T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.733962 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2035d90bb6ffbd065eccd56b401e72fe58499ff5
b323f28d73cbf6082ff23848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.748139 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.765126 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.779069 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.798178 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.817425 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.825259 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.825297 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.825305 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.825324 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.825333 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:35Z","lastTransitionTime":"2025-12-01T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.834835 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.850242 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.864746 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.883125 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.912959 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.928513 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.928568 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.928583 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.928607 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.928624 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:35Z","lastTransitionTime":"2025-12-01T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.928896 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.942494 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.958224 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.973926 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:35 crc kubenswrapper[4585]: I1201 13:58:35.993296 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:35Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.019649 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.031556 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.031604 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.031617 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.031637 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.031651 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:36Z","lastTransitionTime":"2025-12-01T13:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.035614 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.050049 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.063862 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.079794 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.094642 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.106775 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.120077 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.133201 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.134201 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.134254 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.134266 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.134286 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.134297 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:36Z","lastTransitionTime":"2025-12-01T13:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.144442 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.161293 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath
\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.171117 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.236794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.236840 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.236854 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.236872 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.236885 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:36Z","lastTransitionTime":"2025-12-01T13:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.340079 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.340139 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.340155 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.340177 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.340189 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:36Z","lastTransitionTime":"2025-12-01T13:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.412381 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.412458 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:36 crc kubenswrapper[4585]: E1201 13:58:36.412605 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:36 crc kubenswrapper[4585]: E1201 13:58:36.412811 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.412964 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:36 crc kubenswrapper[4585]: E1201 13:58:36.413209 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.431935 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.442604 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.442887 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.442966 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.443088 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.443176 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:36Z","lastTransitionTime":"2025-12-01T13:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.446599 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.495299 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.510347 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.540814 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.545515 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.545560 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.545570 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.545587 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.545599 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:36Z","lastTransitionTime":"2025-12-01T13:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.558414 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.573352 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.589359 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.608551 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.622728 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.639152 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.648255 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.648298 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.648309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.648328 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.648340 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:36Z","lastTransitionTime":"2025-12-01T13:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.653924 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.667716 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.690923 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.702619 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.751433 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.751489 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.751501 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.751525 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.751539 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:36Z","lastTransitionTime":"2025-12-01T13:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.854717 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.854753 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.854765 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.854783 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.854795 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:36Z","lastTransitionTime":"2025-12-01T13:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.960706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.960783 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.960794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.960814 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:36 crc kubenswrapper[4585]: I1201 13:58:36.960847 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:36Z","lastTransitionTime":"2025-12-01T13:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.063632 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.063690 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.063701 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.063719 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.063728 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:37Z","lastTransitionTime":"2025-12-01T13:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.166668 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.166724 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.166736 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.166754 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.166766 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:37Z","lastTransitionTime":"2025-12-01T13:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.269728 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.269787 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.269805 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.269865 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.269884 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:37Z","lastTransitionTime":"2025-12-01T13:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.372162 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.372220 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.372233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.372255 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.372269 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:37Z","lastTransitionTime":"2025-12-01T13:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.475021 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.475064 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.475075 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.475093 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.475106 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:37Z","lastTransitionTime":"2025-12-01T13:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.578317 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.578366 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.578375 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.578392 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.578405 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:37Z","lastTransitionTime":"2025-12-01T13:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.681473 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.681531 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.681542 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.681559 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.681574 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:37Z","lastTransitionTime":"2025-12-01T13:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.695251 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/0.log" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.701105 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848" exitCode=1 Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.701158 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848"} Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.701916 4585 scope.go:117] "RemoveContainer" containerID="2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.718245 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.733261 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.746843 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.767137 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.784296 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.784349 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.784361 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.784381 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.784394 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:37Z","lastTransitionTime":"2025-12-01T13:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.788682 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.810351 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.829288 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.842626 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.855648 4585 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.865209 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.880100 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.886707 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.886737 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.886749 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.886765 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.886776 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:37Z","lastTransitionTime":"2025-12-01T13:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.896708 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.913930 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.936088 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:37Z\\\",\\\"message\\\":\\\"y (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 13:58:36.905704 5794 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.905900 5794 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906012 5794 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906359 5794 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906460 5794 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 13:58:36.906473 5794 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 13:58:36.906540 5794 factory.go:656] Stopping watch factory\\\\nI1201 13:58:36.906555 5794 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 13:58:36.906562 5794 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 13:58:36.906742 5794 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906871 5794 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8
e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.948860 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:37Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.988829 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.988873 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.988886 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.988931 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:37 crc kubenswrapper[4585]: I1201 13:58:37.988947 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:37Z","lastTransitionTime":"2025-12-01T13:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.091937 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.092006 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.092017 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.092036 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.092048 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:38Z","lastTransitionTime":"2025-12-01T13:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.194733 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.194797 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.194809 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.194830 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.194869 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:38Z","lastTransitionTime":"2025-12-01T13:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.297694 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.297735 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.297748 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.297768 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.297779 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:38Z","lastTransitionTime":"2025-12-01T13:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.400343 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.400402 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.400413 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.400437 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.400451 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:38Z","lastTransitionTime":"2025-12-01T13:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.411965 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.412061 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:38 crc kubenswrapper[4585]: E1201 13:58:38.412151 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.412040 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:38 crc kubenswrapper[4585]: E1201 13:58:38.412215 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:38 crc kubenswrapper[4585]: E1201 13:58:38.412491 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.503738 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.503795 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.503805 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.503827 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.503837 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:38Z","lastTransitionTime":"2025-12-01T13:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.606087 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.606143 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.606161 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.606185 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.606201 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:38Z","lastTransitionTime":"2025-12-01T13:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.706917 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/0.log" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.708547 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.708577 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.708587 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.708607 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.708621 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:38Z","lastTransitionTime":"2025-12-01T13:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.710699 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33"} Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.711020 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.725919 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.738936 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.762131 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.803715 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:37Z\\\",\\\"message\\\":\\\"y (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 13:58:36.905704 5794 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.905900 5794 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906012 5794 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906359 5794 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906460 5794 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 13:58:36.906473 5794 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 13:58:36.906540 5794 factory.go:656] Stopping watch factory\\\\nI1201 13:58:36.906555 5794 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 13:58:36.906562 5794 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 13:58:36.906742 5794 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906871 5794 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.811146 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.811198 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.811211 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.811233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.811248 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:38Z","lastTransitionTime":"2025-12-01T13:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.825582 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.844408 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.861106 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.878451 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.895580 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.913964 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.914100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:38 crc 
kubenswrapper[4585]: I1201 13:58:38.914115 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.914135 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.914148 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:38Z","lastTransitionTime":"2025-12-01T13:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.923213 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.937439 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.955396 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.969211 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.983496 4585 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:38 crc kubenswrapper[4585]: I1201 13:58:38.994623 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:38Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.019561 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.019612 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.019624 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.019834 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.019855 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.123529 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.123677 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.123688 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.123708 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.123720 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.227622 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.227685 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.227697 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.227719 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.227732 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.330187 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.330227 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.330236 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.330252 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.330262 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.433623 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.433706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.433733 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.433767 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.433791 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.464108 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.464182 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.464200 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.464226 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.464245 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: E1201 13:58:39.480901 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.485720 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.485805 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.485826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.485852 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.485870 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: E1201 13:58:39.509478 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.514753 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.514929 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.514942 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.514963 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.514993 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: E1201 13:58:39.529845 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.534728 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.534772 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.534788 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.534808 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.534822 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: E1201 13:58:39.549548 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.554452 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.555444 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.555466 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.555474 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.555486 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.555497 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.572789 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: E1201 13:58:39.575053 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae66
9\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: E1201 13:58:39.575164 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.576865 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.576886 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.576894 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.576909 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.576922 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.596176 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.611782 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.637913 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:37Z\\\",\\\"message\\\":\\\"y (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 13:58:36.905704 5794 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.905900 5794 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906012 5794 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906359 5794 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906460 5794 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 13:58:36.906473 5794 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 13:58:36.906540 5794 factory.go:656] Stopping watch factory\\\\nI1201 13:58:36.906555 5794 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 13:58:36.906562 5794 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 13:58:36.906742 5794 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906871 5794 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.654786 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.669665 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.679480 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.679508 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.679516 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.679531 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.679541 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.686930 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.702923 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.710869 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d"] Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.711510 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.714829 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.714932 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.718624 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/1.log" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.719906 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/0.log" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.727036 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33" exitCode=1 Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.727364 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33"} Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.727503 4585 scope.go:117] "RemoveContainer" containerID="2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.728085 4585 scope.go:117] "RemoveContainer" containerID="41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33" Dec 01 13:58:39 crc kubenswrapper[4585]: E1201 13:58:39.728283 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.730615 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.753024 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.769411 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.781784 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.782040 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.782136 4585 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.782273 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.782355 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.788777 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.805000 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.822194 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.824918 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84c45f35-33ae-4eba-97c7-eb85a6db85a5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6kn5d\" (UID: \"84c45f35-33ae-4eba-97c7-eb85a6db85a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.825040 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsfmk\" (UniqueName: \"kubernetes.io/projected/84c45f35-33ae-4eba-97c7-eb85a6db85a5-kube-api-access-rsfmk\") pod \"ovnkube-control-plane-749d76644c-6kn5d\" (UID: \"84c45f35-33ae-4eba-97c7-eb85a6db85a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.825754 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84c45f35-33ae-4eba-97c7-eb85a6db85a5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6kn5d\" (UID: \"84c45f35-33ae-4eba-97c7-eb85a6db85a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.825823 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84c45f35-33ae-4eba-97c7-eb85a6db85a5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6kn5d\" (UID: \"84c45f35-33ae-4eba-97c7-eb85a6db85a5\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.837791 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.854390 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.873356 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.885314 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.885366 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.885380 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.885405 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.885421 4585 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.890372 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.906531 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.926758 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84c45f35-33ae-4eba-97c7-eb85a6db85a5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6kn5d\" (UID: \"84c45f35-33ae-4eba-97c7-eb85a6db85a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.926816 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84c45f35-33ae-4eba-97c7-eb85a6db85a5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6kn5d\" (UID: \"84c45f35-33ae-4eba-97c7-eb85a6db85a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.926857 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsfmk\" (UniqueName: \"kubernetes.io/projected/84c45f35-33ae-4eba-97c7-eb85a6db85a5-kube-api-access-rsfmk\") pod \"ovnkube-control-plane-749d76644c-6kn5d\" (UID: \"84c45f35-33ae-4eba-97c7-eb85a6db85a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.926908 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84c45f35-33ae-4eba-97c7-eb85a6db85a5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6kn5d\" (UID: \"84c45f35-33ae-4eba-97c7-eb85a6db85a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.927757 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84c45f35-33ae-4eba-97c7-eb85a6db85a5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6kn5d\" (UID: \"84c45f35-33ae-4eba-97c7-eb85a6db85a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.928194 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84c45f35-33ae-4eba-97c7-eb85a6db85a5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6kn5d\" (UID: \"84c45f35-33ae-4eba-97c7-eb85a6db85a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.929707 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:37Z\\\",\\\"message\\\":\\\"y (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 13:58:36.905704 5794 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.905900 5794 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906012 5794 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906359 5794 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906460 5794 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 13:58:36.906473 5794 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 13:58:36.906540 5794 factory.go:656] Stopping watch factory\\\\nI1201 13:58:36.906555 5794 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 13:58:36.906562 5794 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 13:58:36.906742 5794 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906871 5794 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"nal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:38.627536 5917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:58:38.627873 5917 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 13:58:38.627909 5917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPat
h\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.941534 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84c45f35-33ae-4eba-97c7-eb85a6db85a5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6kn5d\" (UID: \"84c45f35-33ae-4eba-97c7-eb85a6db85a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.943275 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.946488 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsfmk\" (UniqueName: \"kubernetes.io/projected/84c45f35-33ae-4eba-97c7-eb85a6db85a5-kube-api-access-rsfmk\") pod \"ovnkube-control-plane-749d76644c-6kn5d\" (UID: \"84c45f35-33ae-4eba-97c7-eb85a6db85a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.961075 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.977169 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.988290 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.988342 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.988354 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.988373 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.988387 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:39Z","lastTransitionTime":"2025-12-01T13:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:39 crc kubenswrapper[4585]: I1201 13:58:39.991672 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:39Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.008768 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.030294 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.036339 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70
e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.055704 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.075745 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.094263 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.094307 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.094319 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.094339 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.094350 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:40Z","lastTransitionTime":"2025-12-01T13:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.094466 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.108056 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.121667 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.197688 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.197737 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.197750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.197767 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.197781 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:40Z","lastTransitionTime":"2025-12-01T13:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.299993 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.300056 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.300070 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.300092 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.300103 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:40Z","lastTransitionTime":"2025-12-01T13:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.402147 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.402193 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.402206 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.402222 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.402235 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:40Z","lastTransitionTime":"2025-12-01T13:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.413175 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:40 crc kubenswrapper[4585]: E1201 13:58:40.413308 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.413662 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:40 crc kubenswrapper[4585]: E1201 13:58:40.413720 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.413768 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:40 crc kubenswrapper[4585]: E1201 13:58:40.413812 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.511054 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.511104 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.511114 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.511130 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.511140 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:40Z","lastTransitionTime":"2025-12-01T13:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.613957 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.614025 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.614035 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.614053 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.614064 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:40Z","lastTransitionTime":"2025-12-01T13:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.716449 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.716493 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.716502 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.716520 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.716542 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:40Z","lastTransitionTime":"2025-12-01T13:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.733067 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" event={"ID":"84c45f35-33ae-4eba-97c7-eb85a6db85a5","Type":"ContainerStarted","Data":"b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a"} Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.733120 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" event={"ID":"84c45f35-33ae-4eba-97c7-eb85a6db85a5","Type":"ContainerStarted","Data":"cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc"} Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.733134 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" event={"ID":"84c45f35-33ae-4eba-97c7-eb85a6db85a5","Type":"ContainerStarted","Data":"27cffa2495d86a82da6a457dcde4929cc7cb2a3644ae5ca0164414af5d6b93d8"} Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.734729 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/1.log" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.738216 4585 scope.go:117] "RemoveContainer" containerID="41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33" Dec 01 13:58:40 crc kubenswrapper[4585]: E1201 13:58:40.738412 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.754116 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.769871 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.787019 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 
2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.803060 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.819396 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.819452 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.819462 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.819483 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.819498 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:40Z","lastTransitionTime":"2025-12-01T13:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.820139 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.835817 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.860074 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.876110 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.888335 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.898126 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.916207 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2035d90bb6ffbd065eccd56b401e72fe58499ff5b323f28d73cbf6082ff23848\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:37Z\\\",\\\"message\\\":\\\"y (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 13:58:36.905704 5794 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.905900 5794 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906012 5794 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906359 5794 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906460 5794 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 13:58:36.906473 5794 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 13:58:36.906540 5794 factory.go:656] Stopping watch factory\\\\nI1201 13:58:36.906555 5794 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 13:58:36.906562 5794 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 13:58:36.906742 5794 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 13:58:36.906871 5794 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"nal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:38.627536 5917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:58:38.627873 5917 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 13:58:38.627909 5917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPat
h\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.922233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.922566 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.922698 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.922798 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.922882 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:40Z","lastTransitionTime":"2025-12-01T13:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.926407 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.937409 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 
13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.951013 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.962150 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.973247 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:40 crc kubenswrapper[4585]: I1201 13:58:40.993459 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:40Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.008010 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.021997 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.025877 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.025932 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.025943 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.025960 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.025996 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:41Z","lastTransitionTime":"2025-12-01T13:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.033169 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.043633 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.055782 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.066115 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.074430 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.085277 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.096274 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.106180 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.128443 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.128739 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.128814 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.128885 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.128845 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b08
8b62f43a3feb8dcf8344aa33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"nal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:38.627536 5917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:58:38.627873 5917 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 13:58:38.627909 5917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.128953 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:41Z","lastTransitionTime":"2025-12-01T13:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.140796 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.152161 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 
13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.165762 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.179051 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.192677 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-qrdw5"] Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.193219 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.193311 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.210065 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.224540 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.231357 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.231394 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.231404 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.231422 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.231434 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:41Z","lastTransitionTime":"2025-12-01T13:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.237760 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.251130 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.262783 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da3
5866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.274773 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.285196 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.309324 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"nal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:38.627536 5917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:58:38.627873 5917 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 13:58:38.627909 5917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.326283 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.334406 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.334443 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.334477 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.334493 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.334503 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:41Z","lastTransitionTime":"2025-12-01T13:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.340901 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.341056 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.341090 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmr99\" (UniqueName: \"kubernetes.io/projected/f11a95e1-135a-4fd2-9a04-1487c56a18e1-kube-api-access-pmr99\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.341134 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.341165 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.341192 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.341220 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.341328 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.341411 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-01 13:58:57.341388346 +0000 UTC m=+51.325602201 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.341560 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.341623 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:57.341610342 +0000 UTC m=+51.325824197 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.341690 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.341731 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.341755 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.341836 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:57.341801567 +0000 UTC m=+51.326015422 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.341949 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 13:58:57.341938431 +0000 UTC m=+51.326152286 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.342062 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.342079 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.342088 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.342125 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:57.342117866 +0000 UTC m=+51.326331721 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.342385 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.355043 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.367199 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.381847 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.392370 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.413371 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.427228 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.437504 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.437546 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.437555 4585 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.437575 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.437584 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:41Z","lastTransitionTime":"2025-12-01T13:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.441309 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.441721 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.441747 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmr99\" (UniqueName: \"kubernetes.io/projected/f11a95e1-135a-4fd2-9a04-1487c56a18e1-kube-api-access-pmr99\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.441922 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.442094 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs podName:f11a95e1-135a-4fd2-9a04-1487c56a18e1 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:41.94206857 +0000 UTC m=+35.926282495 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs") pod "network-metrics-daemon-qrdw5" (UID: "f11a95e1-135a-4fd2-9a04-1487c56a18e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.459515 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmr99\" (UniqueName: \"kubernetes.io/projected/f11a95e1-135a-4fd2-9a04-1487c56a18e1-kube-api-access-pmr99\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.540362 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.540637 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.540699 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.540766 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.540829 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:41Z","lastTransitionTime":"2025-12-01T13:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.644068 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.644116 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.644127 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.644145 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.644157 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:41Z","lastTransitionTime":"2025-12-01T13:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.747736 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.748173 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.748185 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.748206 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.748217 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:41Z","lastTransitionTime":"2025-12-01T13:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.850892 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.850945 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.850956 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.851000 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.851014 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:41Z","lastTransitionTime":"2025-12-01T13:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.946123 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.946376 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:58:41 crc kubenswrapper[4585]: E1201 13:58:41.946485 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs podName:f11a95e1-135a-4fd2-9a04-1487c56a18e1 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:42.946460996 +0000 UTC m=+36.930674851 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs") pod "network-metrics-daemon-qrdw5" (UID: "f11a95e1-135a-4fd2-9a04-1487c56a18e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.953908 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.953992 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.954009 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.954028 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:41 crc kubenswrapper[4585]: I1201 13:58:41.954041 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:41Z","lastTransitionTime":"2025-12-01T13:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.056111 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.056140 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.056149 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.056163 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.056172 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:42Z","lastTransitionTime":"2025-12-01T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.157900 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.157931 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.157940 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.157955 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.157966 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:42Z","lastTransitionTime":"2025-12-01T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.259535 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.259595 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.259608 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.259625 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.259637 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:42Z","lastTransitionTime":"2025-12-01T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.362121 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.362161 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.362173 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.362193 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.362485 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:42Z","lastTransitionTime":"2025-12-01T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.412187 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.412260 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:42 crc kubenswrapper[4585]: E1201 13:58:42.412378 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.412399 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:42 crc kubenswrapper[4585]: E1201 13:58:42.412503 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:42 crc kubenswrapper[4585]: E1201 13:58:42.412571 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.464623 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.464665 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.464676 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.464693 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.464704 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:42Z","lastTransitionTime":"2025-12-01T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.566445 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.566482 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.566491 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.566506 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.566516 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:42Z","lastTransitionTime":"2025-12-01T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.668471 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.668506 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.668515 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.668530 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.668540 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:42Z","lastTransitionTime":"2025-12-01T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.770520 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.770568 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.770578 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.770593 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.770603 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:42Z","lastTransitionTime":"2025-12-01T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.872552 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.872581 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.872591 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.872606 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.872614 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:42Z","lastTransitionTime":"2025-12-01T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.958959 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:42 crc kubenswrapper[4585]: E1201 13:58:42.959170 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:58:42 crc kubenswrapper[4585]: E1201 13:58:42.959249 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs podName:f11a95e1-135a-4fd2-9a04-1487c56a18e1 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:44.959226821 +0000 UTC m=+38.943440676 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs") pod "network-metrics-daemon-qrdw5" (UID: "f11a95e1-135a-4fd2-9a04-1487c56a18e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.975185 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.975230 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.975240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.975259 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:42 crc kubenswrapper[4585]: I1201 13:58:42.975271 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:42Z","lastTransitionTime":"2025-12-01T13:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.078315 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.078354 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.078384 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.078403 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.078415 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:43Z","lastTransitionTime":"2025-12-01T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.181597 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.181649 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.181660 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.181679 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.181690 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:43Z","lastTransitionTime":"2025-12-01T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.285606 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.285657 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.285668 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.285687 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.285701 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:43Z","lastTransitionTime":"2025-12-01T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.388487 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.388851 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.388877 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.388910 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.388933 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:43Z","lastTransitionTime":"2025-12-01T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.411837 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:43 crc kubenswrapper[4585]: E1201 13:58:43.412039 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.491348 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.491401 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.491411 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.491431 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.491445 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:43Z","lastTransitionTime":"2025-12-01T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.595095 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.595146 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.595158 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.595178 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.595189 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:43Z","lastTransitionTime":"2025-12-01T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.698085 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.698868 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.698925 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.698963 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.698994 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:43Z","lastTransitionTime":"2025-12-01T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.802727 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.802784 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.802799 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.802819 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.802834 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:43Z","lastTransitionTime":"2025-12-01T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.905418 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.905469 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.905480 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.905502 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:43 crc kubenswrapper[4585]: I1201 13:58:43.905514 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:43Z","lastTransitionTime":"2025-12-01T13:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.007793 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.007856 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.007870 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.007893 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.007908 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:44Z","lastTransitionTime":"2025-12-01T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.110298 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.110335 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.110344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.110359 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.110369 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:44Z","lastTransitionTime":"2025-12-01T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.212766 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.212806 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.212816 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.212832 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.212841 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:44Z","lastTransitionTime":"2025-12-01T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.315267 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.315525 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.315645 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.315713 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.315781 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:44Z","lastTransitionTime":"2025-12-01T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.412054 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:44 crc kubenswrapper[4585]: E1201 13:58:44.412240 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.412065 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:44 crc kubenswrapper[4585]: E1201 13:58:44.412311 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.412054 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:44 crc kubenswrapper[4585]: E1201 13:58:44.412358 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.417251 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.417279 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.417309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.417326 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.417337 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:44Z","lastTransitionTime":"2025-12-01T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.519654 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.519733 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.519746 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.519793 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.519808 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:44Z","lastTransitionTime":"2025-12-01T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.622844 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.622897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.622918 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.622938 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.622949 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:44Z","lastTransitionTime":"2025-12-01T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.737269 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.737321 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.737330 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.737345 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.737355 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:44Z","lastTransitionTime":"2025-12-01T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.839821 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.840454 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.840528 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.840592 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.840648 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:44Z","lastTransitionTime":"2025-12-01T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.943662 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.943730 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.943744 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.943768 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:44 crc kubenswrapper[4585]: I1201 13:58:44.943785 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:44Z","lastTransitionTime":"2025-12-01T13:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.038382 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:45 crc kubenswrapper[4585]: E1201 13:58:45.038548 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:58:45 crc kubenswrapper[4585]: E1201 13:58:45.038610 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs podName:f11a95e1-135a-4fd2-9a04-1487c56a18e1 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:49.038590297 +0000 UTC m=+43.022804152 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs") pod "network-metrics-daemon-qrdw5" (UID: "f11a95e1-135a-4fd2-9a04-1487c56a18e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.046204 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.046248 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.046260 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.046278 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.046290 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:45Z","lastTransitionTime":"2025-12-01T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.148450 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.148484 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.148495 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.148511 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.148521 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:45Z","lastTransitionTime":"2025-12-01T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.251237 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.251276 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.251285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.251303 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.251312 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:45Z","lastTransitionTime":"2025-12-01T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.354384 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.354423 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.354434 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.354452 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.354465 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:45Z","lastTransitionTime":"2025-12-01T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.411883 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:45 crc kubenswrapper[4585]: E1201 13:58:45.412458 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.457769 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.457814 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.457823 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.457841 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.457852 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:45Z","lastTransitionTime":"2025-12-01T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.559926 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.559960 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.559999 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.560016 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.560025 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:45Z","lastTransitionTime":"2025-12-01T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.663105 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.663430 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.663488 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.663549 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.663602 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:45Z","lastTransitionTime":"2025-12-01T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.765947 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.766267 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.766403 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.766496 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.766556 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:45Z","lastTransitionTime":"2025-12-01T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.869480 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.870171 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.870209 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.870230 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.870243 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:45Z","lastTransitionTime":"2025-12-01T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.973260 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.973316 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.973326 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.973343 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:45 crc kubenswrapper[4585]: I1201 13:58:45.973352 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:45Z","lastTransitionTime":"2025-12-01T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.076030 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.076075 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.076084 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.076100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.076110 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:46Z","lastTransitionTime":"2025-12-01T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.178380 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.178674 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.178963 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.179173 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.179345 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:46Z","lastTransitionTime":"2025-12-01T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.282094 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.282128 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.282136 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.282150 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.282159 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:46Z","lastTransitionTime":"2025-12-01T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.385396 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.385458 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.385468 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.385488 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.385498 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:46Z","lastTransitionTime":"2025-12-01T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.411827 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.411885 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.411861 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:46 crc kubenswrapper[4585]: E1201 13:58:46.412255 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:46 crc kubenswrapper[4585]: E1201 13:58:46.412516 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:46 crc kubenswrapper[4585]: E1201 13:58:46.412554 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.426340 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.440167 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.453865 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.469792 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.488841 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.489153 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.489245 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.489339 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.489422 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:46Z","lastTransitionTime":"2025-12-01T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.489553 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.508269 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.519518 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.542376 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.554108 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.564679 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.575680 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.587207 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.592053 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.592270 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.592344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.592422 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.592492 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:46Z","lastTransitionTime":"2025-12-01T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.604585 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b08
8b62f43a3feb8dcf8344aa33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"nal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:38.627536 5917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:58:38.627873 5917 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 13:58:38.627909 5917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.615229 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.625736 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.637323 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.652058 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.694624 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.694891 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.695015 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.695117 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.695214 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:46Z","lastTransitionTime":"2025-12-01T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.797548 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.797613 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.797626 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.797648 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.797664 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:46Z","lastTransitionTime":"2025-12-01T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.899832 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.899895 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.899907 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.899933 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:46 crc kubenswrapper[4585]: I1201 13:58:46.899946 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:46Z","lastTransitionTime":"2025-12-01T13:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.001867 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.001913 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.001925 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.001946 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.001962 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:47Z","lastTransitionTime":"2025-12-01T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.104915 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.104952 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.104961 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.104994 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.105006 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:47Z","lastTransitionTime":"2025-12-01T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.207669 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.207725 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.207737 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.207755 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.207765 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:47Z","lastTransitionTime":"2025-12-01T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.310566 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.310633 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.310644 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.310662 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.310674 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:47Z","lastTransitionTime":"2025-12-01T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.411542 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:47 crc kubenswrapper[4585]: E1201 13:58:47.411760 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.412594 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.412628 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.412636 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.412647 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.412656 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:47Z","lastTransitionTime":"2025-12-01T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.515545 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.515587 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.515598 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.515616 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.515630 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:47Z","lastTransitionTime":"2025-12-01T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.618351 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.618397 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.618405 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.618419 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.618430 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:47Z","lastTransitionTime":"2025-12-01T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.721204 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.721258 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.721271 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.721292 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.721304 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:47Z","lastTransitionTime":"2025-12-01T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.824795 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.825110 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.825182 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.825259 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.825322 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:47Z","lastTransitionTime":"2025-12-01T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.933062 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.933101 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.933110 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.933124 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:47 crc kubenswrapper[4585]: I1201 13:58:47.933136 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:47Z","lastTransitionTime":"2025-12-01T13:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.035789 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.035821 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.035829 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.035844 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.035854 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:48Z","lastTransitionTime":"2025-12-01T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.138700 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.138754 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.138768 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.138790 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.138801 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:48Z","lastTransitionTime":"2025-12-01T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.241340 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.241390 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.241399 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.241415 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.241427 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:48Z","lastTransitionTime":"2025-12-01T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.344459 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.344772 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.344860 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.344948 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.345048 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:48Z","lastTransitionTime":"2025-12-01T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.412440 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.412544 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:48 crc kubenswrapper[4585]: E1201 13:58:48.412646 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.412468 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:48 crc kubenswrapper[4585]: E1201 13:58:48.412789 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:48 crc kubenswrapper[4585]: E1201 13:58:48.412865 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.446733 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.446776 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.446785 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.446799 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.446809 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:48Z","lastTransitionTime":"2025-12-01T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.549440 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.549468 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.549476 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.549491 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.549500 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:48Z","lastTransitionTime":"2025-12-01T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.652713 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.653094 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.653201 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.653284 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.653346 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:48Z","lastTransitionTime":"2025-12-01T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.756500 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.756781 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.756901 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.757003 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.757067 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:48Z","lastTransitionTime":"2025-12-01T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.859823 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.859876 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.859898 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.859919 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.859933 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:48Z","lastTransitionTime":"2025-12-01T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.963586 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.963643 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.963667 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.963698 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:48 crc kubenswrapper[4585]: I1201 13:58:48.963721 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:48Z","lastTransitionTime":"2025-12-01T13:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.067234 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.067281 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.067291 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.067307 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.067317 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.073655 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:49 crc kubenswrapper[4585]: E1201 13:58:49.073771 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:58:49 crc kubenswrapper[4585]: E1201 13:58:49.073841 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs podName:f11a95e1-135a-4fd2-9a04-1487c56a18e1 nodeName:}" failed. No retries permitted until 2025-12-01 13:58:57.073820993 +0000 UTC m=+51.058034848 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs") pod "network-metrics-daemon-qrdw5" (UID: "f11a95e1-135a-4fd2-9a04-1487c56a18e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.170453 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.170700 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.170760 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.170863 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.170920 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.274261 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.274320 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.274333 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.274349 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.274360 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.376427 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.376466 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.376475 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.376492 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.376503 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.411785 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:49 crc kubenswrapper[4585]: E1201 13:58:49.411966 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.478808 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.478856 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.478867 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.478886 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.478900 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.581604 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.581659 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.581667 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.581687 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.581698 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.683765 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.683794 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.683804 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.683823 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.683835 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.741535 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.741645 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.741664 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.741693 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.741717 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: E1201 13:58:49.757486 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:49Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.762385 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.762466 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.762481 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.762504 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.762547 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: E1201 13:58:49.775824 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:49Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.780191 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.780247 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.780262 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.780283 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.780297 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: E1201 13:58:49.799345 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:49Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.803467 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.803635 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.803840 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.804093 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.804236 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: E1201 13:58:49.816322 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:49Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.820426 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.820546 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.820725 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.820805 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.820861 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: E1201 13:58:49.832824 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:49Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:49 crc kubenswrapper[4585]: E1201 13:58:49.833198 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.834816 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.834963 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.835071 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.835135 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.835206 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.938304 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.938602 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.938709 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.938795 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:49 crc kubenswrapper[4585]: I1201 13:58:49.938854 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:49Z","lastTransitionTime":"2025-12-01T13:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.041549 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.041594 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.041602 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.041625 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.041637 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:50Z","lastTransitionTime":"2025-12-01T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.145060 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.145467 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.145581 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.145676 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.145751 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:50Z","lastTransitionTime":"2025-12-01T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.249076 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.249137 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.249154 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.249177 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.249191 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:50Z","lastTransitionTime":"2025-12-01T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.352079 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.352115 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.352123 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.352138 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.352149 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:50Z","lastTransitionTime":"2025-12-01T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
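
The node-status patch failures logged above all end in the same TLS error: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate whose validity ended 2025-08-24T17:21:41Z, while the node clock reads 2025-12-01, so the kubelet's client rejects it with "x509: certificate has expired or is not yet valid" before any admission request is evaluated. The following is a minimal diagnostic sketch in Go (not kubelet code); it assumes the webhook endpoint from the log is reachable from wherever the sketch is run, dials it, and reports the presented certificate's validity window against the current time.

// certcheck.go: fetch the serving certificate of an HTTPS endpoint and compare
// its NotBefore/NotAfter window with the current time. Purely illustrative.
package main

import (
	"crypto/tls"
	"fmt"
	"os"
	"time"
)

func main() {
	// Endpoint taken from the webhook error in the log above; adjust if needed.
	addr := "127.0.0.1:9743"

	// InsecureSkipVerify is deliberate here: verification is expected to fail,
	// and the point is to inspect the certificate that causes the failure.
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Fprintf(os.Stderr, "dial %s: %v\n", addr, err)
		os.Exit(1)
	}
	defer conn.Close()

	now := time.Now().UTC()
	for i, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("cert[%d] subject=%q notBefore=%s notAfter=%s\n",
			i, cert.Subject.CommonName,
			cert.NotBefore.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
		if now.After(cert.NotAfter) {
			fmt.Printf("  expired: current time %s is after %s\n",
				now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
		} else if now.Before(cert.NotBefore) {
			fmt.Printf("  not yet valid: current time %s is before %s\n",
				now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
		}
	}
}

If the printed notAfter matches the 2025-08-24T17:21:41Z value in the log, the webhook is still serving a certificate that has not been rotated since the cluster last refreshed its certificates, which is consistent with the repeated "failed to patch status" retries above.
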
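Independently of the webhook failure, every Ready-condition heartbeat in this window reports the same runtime state: NetworkReady=false because no CNI configuration file is found under /etc/kubernetes/cni/net.d/, which is also why sandbox creation for the pods listed just below keeps being skipped. The sketch below checks that directory for network configuration files; the path comes from the log messages, and the extension list is a conventional assumption rather than the exact matching rules of the container runtime.

// cnicheck.go: report whether any CNI network configuration files are present
// in the directory named by the kubelet's NetworkPluginNotReady messages.
// Illustrative only; the real check is performed by the container runtime.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // path taken from the log above

	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Fprintf(os.Stderr, "read %s: %v\n", confDir, err)
		os.Exit(1)
	}

	// Common CNI config extensions (assumption based on CNI conventions).
	var configs []string
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch strings.ToLower(filepath.Ext(e.Name())) {
		case ".conf", ".conflist", ".json":
			configs = append(configs, e.Name())
		}
	}

	if len(configs) == 0 {
		fmt.Printf("no CNI configuration files in %s; the runtime will keep reporting NetworkReady=false\n", confDir)
		return
	}
	fmt.Printf("found %d CNI configuration file(s) in %s: %s\n",
		len(configs), confDir, strings.Join(configs, ", "))
}
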
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.412220 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.412220 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.412246 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 13:58:50 crc kubenswrapper[4585]: E1201 13:58:50.412396 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 13:58:50 crc kubenswrapper[4585]: E1201 13:58:50.412501 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 13:58:50 crc kubenswrapper[4585]: E1201 13:58:50.412580 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.454300 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.454339 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.454352 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.454371 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.454385 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:50Z","lastTransitionTime":"2025-12-01T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.557230 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.557267 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.557275 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.557292 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.557301 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:50Z","lastTransitionTime":"2025-12-01T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.659740 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.659782 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.659792 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.659809 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.659821 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:50Z","lastTransitionTime":"2025-12-01T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.763262 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.763303 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.763314 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.763333 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.763345 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:50Z","lastTransitionTime":"2025-12-01T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.866335 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.866376 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.866386 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.866403 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.866418 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:50Z","lastTransitionTime":"2025-12-01T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.969309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.969417 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.969431 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.969456 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:50 crc kubenswrapper[4585]: I1201 13:58:50.969469 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:50Z","lastTransitionTime":"2025-12-01T13:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.072076 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.072143 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.072158 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.072180 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.072195 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:51Z","lastTransitionTime":"2025-12-01T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.175045 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.175090 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.175102 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.175127 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.175139 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:51Z","lastTransitionTime":"2025-12-01T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.278114 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.278162 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.278176 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.278194 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.278206 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:51Z","lastTransitionTime":"2025-12-01T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.381130 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.381165 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.381174 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.381190 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.381202 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:51Z","lastTransitionTime":"2025-12-01T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.412423 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:51 crc kubenswrapper[4585]: E1201 13:58:51.412655 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.484267 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.484308 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.484318 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.484333 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.484342 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:51Z","lastTransitionTime":"2025-12-01T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.587033 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.587073 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.587084 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.587104 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.587117 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:51Z","lastTransitionTime":"2025-12-01T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.690834 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.690894 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.690905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.690924 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.690944 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:51Z","lastTransitionTime":"2025-12-01T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.793408 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.793457 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.793468 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.793484 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.793496 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:51Z","lastTransitionTime":"2025-12-01T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.895610 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.895682 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.895692 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.895706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.895715 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:51Z","lastTransitionTime":"2025-12-01T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.998227 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.998268 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.998279 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.998297 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:51 crc kubenswrapper[4585]: I1201 13:58:51.998309 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:51Z","lastTransitionTime":"2025-12-01T13:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.101177 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.101211 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.101222 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.101240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.101251 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:52Z","lastTransitionTime":"2025-12-01T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.204622 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.204670 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.204682 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.204701 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.204712 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:52Z","lastTransitionTime":"2025-12-01T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.307876 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.308270 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.308391 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.308481 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.308557 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:52Z","lastTransitionTime":"2025-12-01T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.411166 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.411212 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.411222 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.411238 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.411248 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:52Z","lastTransitionTime":"2025-12-01T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.411446 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.411490 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.411535 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:52 crc kubenswrapper[4585]: E1201 13:58:52.411576 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:52 crc kubenswrapper[4585]: E1201 13:58:52.411659 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:52 crc kubenswrapper[4585]: E1201 13:58:52.411759 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.513246 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.513284 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.513294 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.513309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.513320 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:52Z","lastTransitionTime":"2025-12-01T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.615761 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.616079 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.616157 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.616276 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.616337 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:52Z","lastTransitionTime":"2025-12-01T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.718690 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.718725 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.718737 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.718752 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.718764 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:52Z","lastTransitionTime":"2025-12-01T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.820955 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.821009 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.821020 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.821039 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.821051 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:52Z","lastTransitionTime":"2025-12-01T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.922944 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.923008 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.923021 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.923038 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:52 crc kubenswrapper[4585]: I1201 13:58:52.923049 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:52Z","lastTransitionTime":"2025-12-01T13:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.026272 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.026544 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.026627 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.026707 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.026784 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:53Z","lastTransitionTime":"2025-12-01T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.129651 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.129699 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.129711 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.129729 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.129740 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:53Z","lastTransitionTime":"2025-12-01T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.232483 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.232539 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.232550 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.232569 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.232580 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:53Z","lastTransitionTime":"2025-12-01T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.334791 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.334846 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.334856 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.334874 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.334886 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:53Z","lastTransitionTime":"2025-12-01T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.411827 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:53 crc kubenswrapper[4585]: E1201 13:58:53.412120 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.438050 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.438453 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.438672 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.438822 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.438950 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:53Z","lastTransitionTime":"2025-12-01T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.541404 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.541814 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.541914 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.542083 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.542184 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:53Z","lastTransitionTime":"2025-12-01T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.644610 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.644655 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.644666 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.644683 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.644694 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:53Z","lastTransitionTime":"2025-12-01T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.747175 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.747212 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.747222 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.747240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.747252 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:53Z","lastTransitionTime":"2025-12-01T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.849581 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.849635 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.849646 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.849664 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.849677 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:53Z","lastTransitionTime":"2025-12-01T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.952270 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.952309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.952318 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.952334 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:53 crc kubenswrapper[4585]: I1201 13:58:53.952346 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:53Z","lastTransitionTime":"2025-12-01T13:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.054669 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.054721 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.054734 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.054755 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.054768 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:54Z","lastTransitionTime":"2025-12-01T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.157684 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.157731 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.157744 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.157761 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.157772 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:54Z","lastTransitionTime":"2025-12-01T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.260266 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.260308 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.260319 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.260338 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.260351 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:54Z","lastTransitionTime":"2025-12-01T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.362750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.362790 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.362800 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.362815 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.362825 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:54Z","lastTransitionTime":"2025-12-01T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.411545 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.411611 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:54 crc kubenswrapper[4585]: E1201 13:58:54.411727 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:54 crc kubenswrapper[4585]: E1201 13:58:54.411922 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.412506 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:54 crc kubenswrapper[4585]: E1201 13:58:54.412576 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.413307 4585 scope.go:117] "RemoveContainer" containerID="41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.466169 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.466211 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.466224 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.466301 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.466315 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:54Z","lastTransitionTime":"2025-12-01T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.569104 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.569366 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.569457 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.569546 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.569652 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:54Z","lastTransitionTime":"2025-12-01T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.673328 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.673358 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.673367 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.673383 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.673423 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:54Z","lastTransitionTime":"2025-12-01T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.776592 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.776636 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.776648 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.776669 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.776682 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:54Z","lastTransitionTime":"2025-12-01T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.780758 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/1.log" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.784113 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee"} Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.784649 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.805903 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:54Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.823019 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:54Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.847948 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:54Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.865420 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:54Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.876470 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:54Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.880177 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.880227 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.880237 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.880253 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.880262 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:54Z","lastTransitionTime":"2025-12-01T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.894372 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5c
c65257651261acd04689f8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"nal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:38.627536 5917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:58:38.627873 5917 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 13:58:38.627909 5917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:54Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.906719 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:54Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.918924 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:54Z is after 2025-08-24T17:21:41Z" Dec 01 
13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.931845 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:54Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.951305 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:54Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.973526 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T13:58:54Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.982864 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.982907 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.982919 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.982935 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.982946 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:54Z","lastTransitionTime":"2025-12-01T13:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:54 crc kubenswrapper[4585]: I1201 13:58:54.988805 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:54Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.011602 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.029704 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.043858 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.056794 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.070431 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.085841 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.085886 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.085898 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.085917 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.085927 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:55Z","lastTransitionTime":"2025-12-01T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.187756 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.187791 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.187799 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.187814 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.187823 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:55Z","lastTransitionTime":"2025-12-01T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.290867 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.290931 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.290945 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.290990 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.291003 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:55Z","lastTransitionTime":"2025-12-01T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.393942 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.394053 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.394078 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.394110 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.394133 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:55Z","lastTransitionTime":"2025-12-01T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.412049 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:55 crc kubenswrapper[4585]: E1201 13:58:55.412266 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.496925 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.497044 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.497069 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.497105 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.497129 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:55Z","lastTransitionTime":"2025-12-01T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.600285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.600336 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.600348 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.600367 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.600382 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:55Z","lastTransitionTime":"2025-12-01T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.703410 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.703449 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.703460 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.703476 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.703488 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:55Z","lastTransitionTime":"2025-12-01T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.790239 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/2.log" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.791349 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/1.log" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.795280 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee" exitCode=1 Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.795338 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee"} Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.795381 4585 scope.go:117] "RemoveContainer" containerID="41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.796062 4585 scope.go:117] "RemoveContainer" containerID="b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee" Dec 01 13:58:55 crc kubenswrapper[4585]: E1201 13:58:55.796257 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.806549 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.806583 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.806592 4585 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.806608 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.806618 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:55Z","lastTransitionTime":"2025-12-01T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.812705 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.825683 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.837824 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:
39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.854573 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7
462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.868806 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.879740 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.898146 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"nal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:38.627536 5917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:58:38.627873 5917 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 13:58:38.627909 5917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:55Z\\\",\\\"message\\\":\\\"Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:55.244725 6128 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:58:55.244735 6128 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 1.845548ms\\\\nI1201 13:58:55.244747 6128 services_controller.go:356] Processing sync for service openshift-console-operator/metrics for network=default\\\\nF1201 13:58:55.244780 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1fcc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.908942 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.909376 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.909420 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.909428 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.909447 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.909458 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:55Z","lastTransitionTime":"2025-12-01T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.921238 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.933213 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.943925 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.957790 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.969289 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:55 crc kubenswrapper[4585]: I1201 13:58:55.987631 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.000008 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:55Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.012765 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.012801 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.012810 4585 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.012829 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.012841 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:56Z","lastTransitionTime":"2025-12-01T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.013631 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.025594 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.115807 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.115846 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.115856 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.115871 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.115885 4585 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:56Z","lastTransitionTime":"2025-12-01T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.218657 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.218737 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.218763 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.218801 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.218830 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:56Z","lastTransitionTime":"2025-12-01T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.322165 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.322206 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.322215 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.322229 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.322239 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:56Z","lastTransitionTime":"2025-12-01T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.412361 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.412426 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.412389 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:56 crc kubenswrapper[4585]: E1201 13:58:56.412532 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:56 crc kubenswrapper[4585]: E1201 13:58:56.412654 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:56 crc kubenswrapper[4585]: E1201 13:58:56.412769 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.423945 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.423996 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.424009 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.424030 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.424043 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:56Z","lastTransitionTime":"2025-12-01T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.426400 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.439233 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.453437 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.464486 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.483404 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.496839 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.511548 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.524273 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.528848 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.528886 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.528907 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.529051 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.529095 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:56Z","lastTransitionTime":"2025-12-01T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.545165 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.558586 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.569344 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.583152 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.600246 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.613174 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.631528 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.631585 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.631595 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.631616 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.631626 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:56Z","lastTransitionTime":"2025-12-01T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.633914 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5c
c65257651261acd04689f8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41a150020fb39ba2ce75b7cba4d24fff9e245b088b62f43a3feb8dcf8344aa33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"message\\\":\\\"nal_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-operator-lifecycle-manager/olm-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/olm-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.168:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {63b1440a-0908-4cab-8799-012fa1cf0b07}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:38.627536 5917 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:58:38.627873 5917 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 13:58:38.627909 5917 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:55Z\\\",\\\"message\\\":\\\"Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:55.244725 6128 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:58:55.244735 6128 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 1.845548ms\\\\nI1201 13:58:55.244747 6128 services_controller.go:356] 
Processing sync for service openshift-console-operator/metrics for network=default\\\\nF1201 13:58:55.244780 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.644521 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.654715 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 
13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.734500 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.734552 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.734563 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.734581 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.734595 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:56Z","lastTransitionTime":"2025-12-01T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.800501 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/2.log" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.804620 4585 scope.go:117] "RemoveContainer" containerID="b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee" Dec 01 13:58:56 crc kubenswrapper[4585]: E1201 13:58:56.804789 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.818615 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.833573 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.837568 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.837625 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.837636 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.837653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.837664 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:56Z","lastTransitionTime":"2025-12-01T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.845807 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.858359 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.873398 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.885318 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.904960 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.918422 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.930140 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.940853 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.940898 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.940912 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.940932 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.940946 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:56Z","lastTransitionTime":"2025-12-01T13:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.943915 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.956296 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.975644 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics
-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:55Z\\\",\\\"message\\\":\\\"Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:55.244725 6128 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:58:55.244735 6128 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 1.845548ms\\\\nI1201 13:58:55.244747 6128 services_controller.go:356] Processing sync for service openshift-console-operator/metrics for network=default\\\\nF1201 13:58:55.244780 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controller initialization, handler {0x1fcc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.986656 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:56 crc kubenswrapper[4585]: I1201 13:58:56.997787 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:56Z is after 2025-08-24T17:21:41Z" Dec 01 
13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.016778 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:57Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.030446 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:57Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.043695 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.043744 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.043755 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.043774 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.043786 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:57Z","lastTransitionTime":"2025-12-01T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.044049 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:58:57Z is after 2025-08-24T17:21:41Z" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.081595 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.081777 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.081845 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs podName:f11a95e1-135a-4fd2-9a04-1487c56a18e1 nodeName:}" failed. No retries permitted until 2025-12-01 13:59:13.081822887 +0000 UTC m=+67.066036742 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs") pod "network-metrics-daemon-qrdw5" (UID: "f11a95e1-135a-4fd2-9a04-1487c56a18e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.146903 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.146951 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.146964 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.147006 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.147020 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:57Z","lastTransitionTime":"2025-12-01T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.249755 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.249798 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.249810 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.249827 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.249839 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:57Z","lastTransitionTime":"2025-12-01T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.352760 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.352802 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.352810 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.352828 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.352840 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:57Z","lastTransitionTime":"2025-12-01T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.387309 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387443 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 13:59:29.387416302 +0000 UTC m=+83.371630157 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.387497 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.387524 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.387552 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.387595 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387682 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387741 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 13:59:29.387731961 +0000 UTC m=+83.371945816 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387682 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387785 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387803 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387794 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387685 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387866 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387879 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387835 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 13:59:29.387826963 +0000 UTC m=+83.372040818 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387929 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 13:59:29.387917666 +0000 UTC m=+83.372131521 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.387951 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 13:59:29.387943546 +0000 UTC m=+83.372157401 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.411738 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:57 crc kubenswrapper[4585]: E1201 13:58:57.412006 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.455531 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.455586 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.455598 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.455618 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.455631 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:57Z","lastTransitionTime":"2025-12-01T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.558539 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.558603 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.558612 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.558630 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.558640 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:57Z","lastTransitionTime":"2025-12-01T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.661658 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.662055 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.662127 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.662210 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.662270 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:57Z","lastTransitionTime":"2025-12-01T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.764846 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.765287 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.765355 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.765435 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.765515 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:57Z","lastTransitionTime":"2025-12-01T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.867993 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.868025 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.868035 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.868072 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.868085 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:57Z","lastTransitionTime":"2025-12-01T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.971834 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.971869 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.971877 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.971895 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:57 crc kubenswrapper[4585]: I1201 13:58:57.971904 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:57Z","lastTransitionTime":"2025-12-01T13:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.074423 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.074716 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.074784 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.074879 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.074948 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:58Z","lastTransitionTime":"2025-12-01T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.178380 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.178451 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.178470 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.178499 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.178517 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:58Z","lastTransitionTime":"2025-12-01T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.282583 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.282714 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.282729 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.282751 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.282764 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:58Z","lastTransitionTime":"2025-12-01T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.385160 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.385226 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.385235 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.385252 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.385263 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:58Z","lastTransitionTime":"2025-12-01T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
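[annotation] Every kubenswrapper message in this journal carries a klog header: a severity letter (I or E here), MMDD, wall-clock time with microseconds, the emitting PID (4585) and the source location (projected.go:288, setters.go:603, nestedpendingoperations.go:348, and so on). A throwaway parser sketch for that header, Python stdlib only; the field names and severity map below are mine, not anything defined by klog:

```python
import re
from datetime import datetime

# klog header: severity letter, MMDD, HH:MM:SS.micros, PID, source file:line, ']'
KLOG = re.compile(r'^([IWEF])(\d{2})(\d{2}) (\d{2}:\d{2}:\d{2}\.\d{6})\s+(\d+) ([\w.-]+):(\d+)\]')
SEVERITY = {'I': 'info', 'W': 'warning', 'E': 'error', 'F': 'fatal'}

def parse_header(line):
    m = KLOG.match(line)
    if not m:
        return None
    sev, month, day, clock, pid, src, lineno = m.groups()
    return {
        'severity': SEVERITY[sev],
        # klog omits the year, so strptime leaves it at its default (1900).
        'time': datetime.strptime(f'{month}{day} {clock}', '%m%d %H:%M:%S.%f'),
        'pid': int(pid),
        'source': f'{src}:{lineno}',
    }

# Header taken from one of the projected-volume errors above (message truncated).
print(parse_header("E1201 13:58:57.387682 4585 projected.go:288] Couldn't get configMap ..."))
```

On the projected.go:288 error above this returns severity 'error', time 12-01 13:58:57.387682, pid 4585 and source 'projected.go:288'.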
Has your network provider started?"} Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.411736 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:58:58 crc kubenswrapper[4585]: E1201 13:58:58.411883 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.412131 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:58:58 crc kubenswrapper[4585]: E1201 13:58:58.412198 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.412249 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:58:58 crc kubenswrapper[4585]: E1201 13:58:58.412332 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.488236 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.488287 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.488298 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.488319 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.488335 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:58Z","lastTransitionTime":"2025-12-01T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.591008 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.591046 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.591054 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.591070 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.591080 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:58Z","lastTransitionTime":"2025-12-01T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.693516 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.693562 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.693577 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.693595 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.693608 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:58Z","lastTransitionTime":"2025-12-01T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.796862 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.796910 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.796926 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.796946 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.796960 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:58Z","lastTransitionTime":"2025-12-01T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.900315 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.900354 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.900364 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.900383 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:58 crc kubenswrapper[4585]: I1201 13:58:58.900395 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:58Z","lastTransitionTime":"2025-12-01T13:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.003064 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.003102 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.003111 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.003131 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.003183 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:59Z","lastTransitionTime":"2025-12-01T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.110301 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.110342 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.110353 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.110372 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.110385 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:59Z","lastTransitionTime":"2025-12-01T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
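[annotation] The kubelet_node_status.go:724 / setters.go:603 block repeats the same five records (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, then the Ready condition) roughly every 100 ms while the status loop keeps finding no CNI configuration in /etc/kubernetes/cni/net.d/. The condition payload is plain JSON after condition=, so it can be lifted straight out of a line; a sketch using one of the 13:58:57 entries above verbatim:

```python
import json
import re

# One of the setters.go:603 entries above, with the condition payload verbatim.
ENTRY = ('I1201 13:58:57.455631 4585 setters.go:603] "Node became not ready" node="crc" '
         'condition={"type":"Ready","status":"False",'
         '"lastHeartbeatTime":"2025-12-01T13:58:57Z","lastTransitionTime":"2025-12-01T13:58:57Z",'
         '"reason":"KubeletNotReady","message":"container runtime network not ready: '
         'NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: '
         'no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}')

# Everything after "condition=" is ordinary JSON; decode it and read the fields.
cond = json.loads(re.search(r'condition=(\{.*\})', ENTRY).group(1))
print(cond['type'], cond['status'], cond['reason'])
print(cond['message'])
```

Decoded, every one of these repeats says the same thing: Ready=False with reason KubeletNotReady, because the network plugin has produced no CNI configuration yet.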
Has your network provider started?"} Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.212785 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.212823 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.212834 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.212849 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.212859 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:59Z","lastTransitionTime":"2025-12-01T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.315509 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.315545 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.315554 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.315570 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.315580 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:59Z","lastTransitionTime":"2025-12-01T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.412347 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:58:59 crc kubenswrapper[4585]: E1201 13:58:59.412549 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.418533 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.418615 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.418630 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.418651 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.418696 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:59Z","lastTransitionTime":"2025-12-01T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.521392 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.521443 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.521456 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.521476 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.521488 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:59Z","lastTransitionTime":"2025-12-01T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.624616 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.625120 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.625348 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.625565 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.625756 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:59Z","lastTransitionTime":"2025-12-01T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.728703 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.728766 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.728774 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.728789 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.728820 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:59Z","lastTransitionTime":"2025-12-01T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.831613 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.831647 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.831655 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.831669 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.831679 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:59Z","lastTransitionTime":"2025-12-01T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.934277 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.934340 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.934355 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.934376 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:58:59 crc kubenswrapper[4585]: I1201 13:58:59.934392 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:58:59Z","lastTransitionTime":"2025-12-01T13:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.036513 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.036547 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.036555 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.036573 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.036583 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
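[annotation] Interleaved with that churn, util.go:30 notes "No sandbox for pod can be found. Need to start a new one" and pod_workers.go:1301 then skips the sync because the network is not ready. A sketch that collects which pods are blocked this way, using the entries above with the long err strings abridged to "...":

```python
import re

# The pod_workers.go:1301 entries above, with the long err strings abridged.
LINES = [
    'E1201 13:58:57.412006 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: ..." pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1"',
    'E1201 13:58:58.411883 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: ..." pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"',
    'E1201 13:58:58.412198 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: ..." pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"',
    'E1201 13:58:58.412332 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: ..." pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"',
]

# Map podUID -> namespace/name for every pod skipped while the network is not ready.
blocked = {}
for line in LINES:
    if 'network is not ready' not in line:
        continue
    m = re.search(r'pod="([^"]+)" podUID="([^"]+)"', line)
    if m:
        blocked[m.group(2)] = m.group(1)

for uid, pod in blocked.items():
    print(f'{pod}  (podUID {uid})')
```

Four pods are stuck on the missing CNI configuration in this excerpt: network-metrics-daemon-qrdw5, networking-console-plugin-85b44fc459-gdk6g, network-check-source-55646444c4-trplf and network-check-target-xd92c.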
Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.066370 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.075849 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.079793 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\
",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.093963 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f3
6dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.100627 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.100691 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.100704 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.100724 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.100736 4585 setters.go:603] "Node became not ready" node="crc" 
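[annotation] The status_manager.go:875 failures above are the more consequential errors in this window: the kubelet cannot patch pod status for ovnkube-control-plane-749d76644c-6kn5d or kube-apiserver-crc because the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-01T13:59:00Z. A trivial check of the gap, using only the two timestamps quoted in the x509 error:

```python
from datetime import datetime

# Timestamps quoted in the x509 error above:
#   "current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z"
now       = datetime.fromisoformat('2025-12-01T13:59:00+00:00')
not_after = datetime.fromisoformat('2025-08-24T17:21:41+00:00')

print('webhook serving certificate expired', now - not_after, 'before this log line')
```

That is just under 99 days past notAfter, so every status patch routed through this webhook is rejected with the same x509 error until the certificate is renewed.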
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.110031 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: E1201 13:59:00.112464 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.120588 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.120641 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.120653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.120674 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.120686 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.123476 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: E1201 13:59:00.133913 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.137841 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.137875 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.137888 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.137907 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.137920 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.143760 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5c
c65257651261acd04689f8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:55Z\\\",\\\"message\\\":\\\"Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:55.244725 6128 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:58:55.244735 6128 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 1.845548ms\\\\nI1201 13:58:55.244747 6128 services_controller.go:356] Processing sync for service openshift-console-operator/metrics for network=default\\\\nF1201 13:58:55.244780 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: E1201 13:59:00.148692 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.152397 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.152450 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.152459 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.152479 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.152490 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.156037 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: E1201 13:59:00.164056 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.167130 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.168007 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.168042 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.168050 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.168066 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.168075 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.180049 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: E1201 13:59:00.185963 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: E1201 13:59:00.186104 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.188176 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.188222 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.188237 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.188256 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.188272 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.198139 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.213141 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.222377 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.241248 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.253724 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.269033 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.282071 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.291415 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.291473 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.291486 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.291510 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.291525 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.298499 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.309755 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:00Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.394292 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.394334 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.394344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.394380 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.394394 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.411724 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.411799 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:00 crc kubenswrapper[4585]: E1201 13:59:00.411928 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:00 crc kubenswrapper[4585]: E1201 13:59:00.412031 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.412251 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:00 crc kubenswrapper[4585]: E1201 13:59:00.412504 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.496924 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.496966 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.496988 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.497002 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.497013 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.599887 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.599950 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.599959 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.599994 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.600006 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.702304 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.702348 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.702360 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.702377 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.702387 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.805184 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.805229 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.805244 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.805263 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.805276 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.908495 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.908568 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.908577 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.908593 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:00 crc kubenswrapper[4585]: I1201 13:59:00.908602 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:00Z","lastTransitionTime":"2025-12-01T13:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.011004 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.011078 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.011096 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.011119 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.011134 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:01Z","lastTransitionTime":"2025-12-01T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.114336 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.114394 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.114407 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.114427 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.114440 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:01Z","lastTransitionTime":"2025-12-01T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.217758 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.217804 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.217813 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.217832 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.217843 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:01Z","lastTransitionTime":"2025-12-01T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.320940 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.320988 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.321010 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.321025 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.321036 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:01Z","lastTransitionTime":"2025-12-01T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.411898 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:01 crc kubenswrapper[4585]: E1201 13:59:01.412186 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.423897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.423992 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.424008 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.424026 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.424038 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:01Z","lastTransitionTime":"2025-12-01T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.527201 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.527239 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.527249 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.527264 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.527273 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:01Z","lastTransitionTime":"2025-12-01T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.630906 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.630950 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.630960 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.630997 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.631010 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:01Z","lastTransitionTime":"2025-12-01T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.733347 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.733379 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.733388 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.733403 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.733412 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:01Z","lastTransitionTime":"2025-12-01T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.836277 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.836349 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.836362 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.836382 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.836396 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:01Z","lastTransitionTime":"2025-12-01T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.942661 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.942708 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.942720 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.942738 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:01 crc kubenswrapper[4585]: I1201 13:59:01.942781 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:01Z","lastTransitionTime":"2025-12-01T13:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.045186 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.045221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.045232 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.045248 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.045260 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:02Z","lastTransitionTime":"2025-12-01T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.147563 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.147609 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.147625 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.147647 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.147662 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:02Z","lastTransitionTime":"2025-12-01T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.250261 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.250296 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.250307 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.250349 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.250361 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:02Z","lastTransitionTime":"2025-12-01T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.352795 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.352857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.352897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.352937 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.352963 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:02Z","lastTransitionTime":"2025-12-01T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.411786 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.411838 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.411940 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:02 crc kubenswrapper[4585]: E1201 13:59:02.412066 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:02 crc kubenswrapper[4585]: E1201 13:59:02.412248 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:02 crc kubenswrapper[4585]: E1201 13:59:02.412348 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.456124 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.456208 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.456470 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.456489 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.456499 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:02Z","lastTransitionTime":"2025-12-01T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.559744 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.560268 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.560466 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.560601 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.560735 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:02Z","lastTransitionTime":"2025-12-01T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.663834 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.663879 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.663888 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.663910 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.663921 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:02Z","lastTransitionTime":"2025-12-01T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.767481 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.768995 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.769309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.769484 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.769646 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:02Z","lastTransitionTime":"2025-12-01T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.873038 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.873073 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.873082 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.873097 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.873106 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:02Z","lastTransitionTime":"2025-12-01T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.975817 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.975855 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.975866 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.975885 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:02 crc kubenswrapper[4585]: I1201 13:59:02.975908 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:02Z","lastTransitionTime":"2025-12-01T13:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.078772 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.078824 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.078835 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.078853 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.078869 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:03Z","lastTransitionTime":"2025-12-01T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.181543 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.181844 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.181854 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.181871 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.181883 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:03Z","lastTransitionTime":"2025-12-01T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.284164 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.284200 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.284212 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.284231 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.284245 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:03Z","lastTransitionTime":"2025-12-01T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.389134 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.389221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.389254 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.389276 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.389288 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:03Z","lastTransitionTime":"2025-12-01T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.412453 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:03 crc kubenswrapper[4585]: E1201 13:59:03.412619 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.492624 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.493225 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.493442 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.493534 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.493608 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:03Z","lastTransitionTime":"2025-12-01T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.596674 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.597002 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.597083 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.597210 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.597286 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:03Z","lastTransitionTime":"2025-12-01T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.700133 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.700444 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.700584 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.700767 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.700914 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:03Z","lastTransitionTime":"2025-12-01T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.804428 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.804926 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.805187 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.805416 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.805830 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:03Z","lastTransitionTime":"2025-12-01T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.909040 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.909366 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.909462 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.909583 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:03 crc kubenswrapper[4585]: I1201 13:59:03.909818 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:03Z","lastTransitionTime":"2025-12-01T13:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.012953 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.013282 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.013379 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.013480 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.013575 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:04Z","lastTransitionTime":"2025-12-01T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.123334 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.123371 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.123382 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.123425 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.123436 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:04Z","lastTransitionTime":"2025-12-01T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.226334 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.226699 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.226791 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.226886 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.227148 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:04Z","lastTransitionTime":"2025-12-01T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.330356 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.330423 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.330437 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.330451 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.330461 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:04Z","lastTransitionTime":"2025-12-01T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.412813 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:04 crc kubenswrapper[4585]: E1201 13:59:04.412929 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.413022 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.412814 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:04 crc kubenswrapper[4585]: E1201 13:59:04.414915 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:04 crc kubenswrapper[4585]: E1201 13:59:04.415678 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.433637 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.433690 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.433703 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.433720 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.433735 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:04Z","lastTransitionTime":"2025-12-01T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.536835 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.536863 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.536872 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.536884 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.536892 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:04Z","lastTransitionTime":"2025-12-01T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.639478 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.639805 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.640041 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.640286 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.640478 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:04Z","lastTransitionTime":"2025-12-01T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.742244 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.742505 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.742566 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.742635 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.742700 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:04Z","lastTransitionTime":"2025-12-01T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.844916 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.844953 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.844964 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.844993 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.845004 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:04Z","lastTransitionTime":"2025-12-01T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.947523 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.947589 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.947600 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.947619 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:04 crc kubenswrapper[4585]: I1201 13:59:04.947630 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:04Z","lastTransitionTime":"2025-12-01T13:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.050163 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.050516 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.050683 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.050842 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.051093 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:05Z","lastTransitionTime":"2025-12-01T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.154139 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.154192 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.154204 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.154221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.154233 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:05Z","lastTransitionTime":"2025-12-01T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.256198 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.256225 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.256233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.256247 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.256255 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:05Z","lastTransitionTime":"2025-12-01T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.357910 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.357997 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.358010 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.358027 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.358037 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:05Z","lastTransitionTime":"2025-12-01T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.412045 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:05 crc kubenswrapper[4585]: E1201 13:59:05.412239 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.460640 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.460678 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.460687 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.460701 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.460709 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:05Z","lastTransitionTime":"2025-12-01T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.562656 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.562717 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.562728 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.562746 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.562760 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:05Z","lastTransitionTime":"2025-12-01T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.664839 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.665214 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.665285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.665352 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.665414 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:05Z","lastTransitionTime":"2025-12-01T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.767366 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.767642 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.767739 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.767830 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.767906 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:05Z","lastTransitionTime":"2025-12-01T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.870532 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.870752 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.870816 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.870934 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.871038 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:05Z","lastTransitionTime":"2025-12-01T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.973822 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.974129 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.974196 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.974387 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:05 crc kubenswrapper[4585]: I1201 13:59:05.974674 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:05Z","lastTransitionTime":"2025-12-01T13:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.076716 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.076950 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.077192 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.077427 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.077623 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:06Z","lastTransitionTime":"2025-12-01T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.179923 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.180195 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.180254 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.180320 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.180408 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:06Z","lastTransitionTime":"2025-12-01T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.281893 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.282254 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.282337 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.282419 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.282504 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:06Z","lastTransitionTime":"2025-12-01T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.384528 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.384559 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.384569 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.384581 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.384589 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:06Z","lastTransitionTime":"2025-12-01T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.411728 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:06 crc kubenswrapper[4585]: E1201 13:59:06.411840 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.411892 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:06 crc kubenswrapper[4585]: E1201 13:59:06.412089 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.412105 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:06 crc kubenswrapper[4585]: E1201 13:59:06.412183 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.426368 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.436813 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.454633 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:55Z\\\",\\\"message\\\":\\\"Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:55.244725 6128 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:58:55.244735 6128 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 1.845548ms\\\\nI1201 13:58:55.244747 6128 services_controller.go:356] Processing sync for service openshift-console-operator/metrics for network=default\\\\nF1201 13:58:55.244780 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1fcc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.466001 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.477259 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 
13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.486583 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.486859 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.486952 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.487096 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.487191 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:06Z","lastTransitionTime":"2025-12-01T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.489288 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.501007 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.513496 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.527510 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.542566 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.554576 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 
2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.565418 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.575825 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.588112 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.589814 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.589863 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.589874 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.589891 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.589902 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:06Z","lastTransitionTime":"2025-12-01T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.599059 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.617023 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.629020 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.638880 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"979efa67-6804-42c6-9661-43397d760d30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43ecbe420cb6fc8702eae53c7d3a369f12a986685824f1f08c15398e47985cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb6a609f35d4968e1f0a001362568ddcc21b80f897ad76b3b7f9e7f4b1651af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e64b25d69a7d86297a1761d8a2c7e62e508e69a19ff6d48679efee71724b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:06Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.692479 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.692759 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.692839 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.692924 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.693040 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:06Z","lastTransitionTime":"2025-12-01T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.794948 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.795012 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.795020 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.795035 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.795044 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:06Z","lastTransitionTime":"2025-12-01T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.897086 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.897133 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.897144 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.897161 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.897172 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:06Z","lastTransitionTime":"2025-12-01T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.998913 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.998943 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.998954 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.998988 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:06 crc kubenswrapper[4585]: I1201 13:59:06.998999 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:06Z","lastTransitionTime":"2025-12-01T13:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.101568 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.101882 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.102054 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.102162 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.102255 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:07Z","lastTransitionTime":"2025-12-01T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.206151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.206192 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.206203 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.206220 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.206231 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:07Z","lastTransitionTime":"2025-12-01T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.339788 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.339853 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.339864 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.339889 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.339902 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:07Z","lastTransitionTime":"2025-12-01T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.411517 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:07 crc kubenswrapper[4585]: E1201 13:59:07.411718 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.412885 4585 scope.go:117] "RemoveContainer" containerID="b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee" Dec 01 13:59:07 crc kubenswrapper[4585]: E1201 13:59:07.413081 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.442236 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.442283 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.442296 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.442314 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.442327 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:07Z","lastTransitionTime":"2025-12-01T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.544306 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.544343 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.544350 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.544364 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.544373 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:07Z","lastTransitionTime":"2025-12-01T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.647203 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.647250 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.647263 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.647280 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.647292 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:07Z","lastTransitionTime":"2025-12-01T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.750035 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.750088 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.750101 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.750117 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.750128 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:07Z","lastTransitionTime":"2025-12-01T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.853420 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.853781 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.853869 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.853989 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.854102 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:07Z","lastTransitionTime":"2025-12-01T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.958215 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.958321 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.958344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.958374 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:07 crc kubenswrapper[4585]: I1201 13:59:07.958394 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:07Z","lastTransitionTime":"2025-12-01T13:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.060585 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.060646 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.060658 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.060673 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.060682 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:08Z","lastTransitionTime":"2025-12-01T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.163275 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.163322 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.163333 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.163608 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.163623 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:08Z","lastTransitionTime":"2025-12-01T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.266425 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.266487 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.266501 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.266527 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.266539 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:08Z","lastTransitionTime":"2025-12-01T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.368963 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.369019 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.369030 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.369047 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.369057 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:08Z","lastTransitionTime":"2025-12-01T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.412006 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.412037 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.412037 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:08 crc kubenswrapper[4585]: E1201 13:59:08.412146 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:08 crc kubenswrapper[4585]: E1201 13:59:08.412246 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:08 crc kubenswrapper[4585]: E1201 13:59:08.412305 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.471112 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.471155 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.471166 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.471181 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.471191 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:08Z","lastTransitionTime":"2025-12-01T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.573375 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.573415 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.573427 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.573443 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.573454 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:08Z","lastTransitionTime":"2025-12-01T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.676207 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.676246 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.676255 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.676270 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.676279 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:08Z","lastTransitionTime":"2025-12-01T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.778603 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.778655 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.778666 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.778679 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.778690 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:08Z","lastTransitionTime":"2025-12-01T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.881106 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.881146 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.881157 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.881172 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.881183 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:08Z","lastTransitionTime":"2025-12-01T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.984852 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.984909 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.984919 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.984941 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:08 crc kubenswrapper[4585]: I1201 13:59:08.984955 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:08Z","lastTransitionTime":"2025-12-01T13:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.087535 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.087588 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.087605 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.087625 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.087638 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:09Z","lastTransitionTime":"2025-12-01T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.189599 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.189649 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.189658 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.189675 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.189684 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:09Z","lastTransitionTime":"2025-12-01T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.292416 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.292455 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.292468 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.292483 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.292492 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:09Z","lastTransitionTime":"2025-12-01T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.394497 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.394560 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.394574 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.394593 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.394607 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:09Z","lastTransitionTime":"2025-12-01T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.411752 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:09 crc kubenswrapper[4585]: E1201 13:59:09.411852 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.496826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.496877 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.496889 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.496905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.496915 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:09Z","lastTransitionTime":"2025-12-01T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.600186 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.600230 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.600243 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.600258 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.600270 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:09Z","lastTransitionTime":"2025-12-01T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.702894 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.703639 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.703816 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.703944 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.704115 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:09Z","lastTransitionTime":"2025-12-01T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.806201 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.806228 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.806235 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.806248 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.806257 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:09Z","lastTransitionTime":"2025-12-01T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.907905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.908210 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.908288 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.908354 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:09 crc kubenswrapper[4585]: I1201 13:59:09.908435 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:09Z","lastTransitionTime":"2025-12-01T13:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.010610 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.011125 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.011297 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.011495 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.011642 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.114075 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.114104 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.114114 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.114129 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.114140 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.216082 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.216109 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.216119 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.216134 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.216143 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.318572 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.318604 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.318615 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.318630 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.318642 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.414939 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:10 crc kubenswrapper[4585]: E1201 13:59:10.415082 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.415281 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:10 crc kubenswrapper[4585]: E1201 13:59:10.415341 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.415474 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:10 crc kubenswrapper[4585]: E1201 13:59:10.415666 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.420800 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.420825 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.420834 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.420845 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.420855 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.523250 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.523287 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.523299 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.523314 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.523325 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.577250 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.577311 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.577323 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.577341 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.577354 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: E1201 13:59:10.589296 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:10Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.593256 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.593295 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.593307 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.593325 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.593337 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: E1201 13:59:10.605806 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:10Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.609410 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.609605 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.609788 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.609898 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.610005 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: E1201 13:59:10.622441 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:10Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.625964 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.626119 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.626211 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.626527 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.626619 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: E1201 13:59:10.636999 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:10Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.640156 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.640195 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.640208 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.640224 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.640235 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: E1201 13:59:10.650581 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:10Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:10 crc kubenswrapper[4585]: E1201 13:59:10.650775 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.652185 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.652221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.652233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.652247 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.652258 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.755072 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.755703 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.755797 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.755906 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.756027 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.858568 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.858622 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.858633 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.858653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.858664 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.960374 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.960627 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.960742 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.960823 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:10 crc kubenswrapper[4585]: I1201 13:59:10.960887 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:10Z","lastTransitionTime":"2025-12-01T13:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.063209 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.063285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.063299 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.063314 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.063347 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:11Z","lastTransitionTime":"2025-12-01T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.165620 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.165662 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.165673 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.165688 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.165700 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:11Z","lastTransitionTime":"2025-12-01T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.268406 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.268453 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.268465 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.268482 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.268493 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:11Z","lastTransitionTime":"2025-12-01T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.370871 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.370919 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.370929 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.370947 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.370959 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:11Z","lastTransitionTime":"2025-12-01T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.412303 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:11 crc kubenswrapper[4585]: E1201 13:59:11.412436 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.473329 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.473366 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.473376 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.473394 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.473406 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:11Z","lastTransitionTime":"2025-12-01T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.575701 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.575738 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.575749 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.575764 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.575775 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:11Z","lastTransitionTime":"2025-12-01T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.678166 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.678465 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.678704 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.678894 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.679123 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:11Z","lastTransitionTime":"2025-12-01T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.783720 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.784462 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.784545 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.784654 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.784725 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:11Z","lastTransitionTime":"2025-12-01T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.886562 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.886598 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.886606 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.886621 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.886630 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:11Z","lastTransitionTime":"2025-12-01T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.989051 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.989333 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.989414 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.989508 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:11 crc kubenswrapper[4585]: I1201 13:59:11.989598 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:11Z","lastTransitionTime":"2025-12-01T13:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.092071 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.092113 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.092125 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.092142 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.092154 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:12Z","lastTransitionTime":"2025-12-01T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.194731 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.194764 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.194774 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.194788 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.194798 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:12Z","lastTransitionTime":"2025-12-01T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.297542 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.297578 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.297587 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.297601 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.297611 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:12Z","lastTransitionTime":"2025-12-01T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.400058 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.400381 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.400476 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.400574 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.400650 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:12Z","lastTransitionTime":"2025-12-01T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.412390 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:12 crc kubenswrapper[4585]: E1201 13:59:12.412767 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.412404 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.412390 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:12 crc kubenswrapper[4585]: E1201 13:59:12.413059 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:12 crc kubenswrapper[4585]: E1201 13:59:12.413076 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.503438 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.503486 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.503499 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.503517 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.503530 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:12Z","lastTransitionTime":"2025-12-01T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.605841 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.605904 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.605917 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.605936 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.605952 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:12Z","lastTransitionTime":"2025-12-01T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.708412 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.708443 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.708452 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.708466 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.708477 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:12Z","lastTransitionTime":"2025-12-01T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.810735 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.811388 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.811465 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.811706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.811809 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:12Z","lastTransitionTime":"2025-12-01T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.914819 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.914857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.914866 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.914880 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:12 crc kubenswrapper[4585]: I1201 13:59:12.914892 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:12Z","lastTransitionTime":"2025-12-01T13:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.017196 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.017235 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.017244 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.017258 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.017269 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:13Z","lastTransitionTime":"2025-12-01T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.119563 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.119609 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.119621 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.119638 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.119650 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:13Z","lastTransitionTime":"2025-12-01T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.163189 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:13 crc kubenswrapper[4585]: E1201 13:59:13.163382 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:59:13 crc kubenswrapper[4585]: E1201 13:59:13.163718 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs podName:f11a95e1-135a-4fd2-9a04-1487c56a18e1 nodeName:}" failed. No retries permitted until 2025-12-01 13:59:45.163694294 +0000 UTC m=+99.147908159 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs") pod "network-metrics-daemon-qrdw5" (UID: "f11a95e1-135a-4fd2-9a04-1487c56a18e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.222258 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.222303 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.222335 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.222358 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.222370 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:13Z","lastTransitionTime":"2025-12-01T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.325308 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.325340 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.325352 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.325367 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.325376 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:13Z","lastTransitionTime":"2025-12-01T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.411938 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:13 crc kubenswrapper[4585]: E1201 13:59:13.412209 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.422510 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.429126 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.429170 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.429182 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.429199 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.429210 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:13Z","lastTransitionTime":"2025-12-01T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.531428 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.531461 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.531469 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.531484 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.531494 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:13Z","lastTransitionTime":"2025-12-01T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.633525 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.633558 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.633570 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.633586 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.633596 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:13Z","lastTransitionTime":"2025-12-01T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.735585 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.735623 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.735634 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.735651 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.735661 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:13Z","lastTransitionTime":"2025-12-01T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.838922 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.838998 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.839013 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.839028 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.839037 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:13Z","lastTransitionTime":"2025-12-01T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.856670 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9wjs5_6e7ad3ad-7937-409b-b1c9-9c801f937400/kube-multus/0.log" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.856717 4585 generic.go:334] "Generic (PLEG): container finished" podID="6e7ad3ad-7937-409b-b1c9-9c801f937400" containerID="9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073" exitCode=1 Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.856800 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9wjs5" event={"ID":"6e7ad3ad-7937-409b-b1c9-9c801f937400","Type":"ContainerDied","Data":"9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073"} Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.858055 4585 scope.go:117] "RemoveContainer" containerID="9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.869892 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:13Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.882269 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:13Z\\\",\\\"message\\\":\\\"2025-12-01T13:58:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c\\\\n2025-12-01T13:58:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c to /host/opt/cni/bin/\\\\n2025-12-01T13:58:28Z [verbose] multus-daemon started\\\\n2025-12-01T13:58:28Z [verbose] Readiness Indicator file check\\\\n2025-12-01T13:59:13Z [error] have you checked that your 
default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:13Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.898571 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:13Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.917200 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:13Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.930256 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:13Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.941092 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"979efa67-6804-42c6-9661-43397d760d30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43ecbe420cb6fc8702eae53c7d3a369f12a986685824f1f08c15398e47985cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb6a609f35d4968e1f0a001362568ddcc21b80f897ad76b3b7f9e7f4b1651af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e64b25d69a7d86297a1761d8a2c7e62e508e69a19ff6d48679efee71724b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:13Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.942349 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.942498 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.942559 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.942633 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.942704 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:13Z","lastTransitionTime":"2025-12-01T13:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.955130 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:13Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.971366 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:13Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:13 crc kubenswrapper[4585]: I1201 13:59:13.984386 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:13Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.000267 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:13Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.010990 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e6fbe56-8f79-4138-b082-5fe0f2198590\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4cd6715a83e1d1fc7d714956406fd0260653d09700c83f129dc4399bb089228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.023352 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.034545 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.045501 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.045536 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.045545 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.045561 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.045571 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:14Z","lastTransitionTime":"2025-12-01T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.047899 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753
fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.061278 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.070666 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.088921 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:55Z\\\",\\\"message\\\":\\\"Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:55.244725 6128 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:58:55.244735 6128 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 1.845548ms\\\\nI1201 13:58:55.244747 6128 services_controller.go:356] Processing sync for service openshift-console-operator/metrics for network=default\\\\nF1201 13:58:55.244780 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.109480 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.136140 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.147509 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.147556 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.147565 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.147578 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.147586 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:14Z","lastTransitionTime":"2025-12-01T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.249854 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.249889 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.249898 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.249915 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.249925 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:14Z","lastTransitionTime":"2025-12-01T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.352336 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.352375 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.352386 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.352406 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.352418 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:14Z","lastTransitionTime":"2025-12-01T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.412051 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.412131 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:14 crc kubenswrapper[4585]: E1201 13:59:14.412315 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.412127 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:14 crc kubenswrapper[4585]: E1201 13:59:14.412415 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:14 crc kubenswrapper[4585]: E1201 13:59:14.412535 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.454650 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.454694 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.454703 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.454718 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.454728 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:14Z","lastTransitionTime":"2025-12-01T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.557155 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.557191 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.557202 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.557219 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.557229 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:14Z","lastTransitionTime":"2025-12-01T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.659418 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.659444 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.659454 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.659467 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.659476 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:14Z","lastTransitionTime":"2025-12-01T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.761522 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.761548 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.761555 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.761568 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.761576 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:14Z","lastTransitionTime":"2025-12-01T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.861439 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9wjs5_6e7ad3ad-7937-409b-b1c9-9c801f937400/kube-multus/0.log" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.861991 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9wjs5" event={"ID":"6e7ad3ad-7937-409b-b1c9-9c801f937400","Type":"ContainerStarted","Data":"babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953"} Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.864199 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.864245 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.864260 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.864278 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.864291 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:14Z","lastTransitionTime":"2025-12-01T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.872849 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e6fbe56-8f79-4138-b082-5fe0f2198590\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4cd6715a83e1d1fc7d714956406fd0260653d09700c83f129dc4399bb089228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.884280 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.892859 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.905324 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.917094 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.927629 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.945737 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:55Z\\\",\\\"message\\\":\\\"Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:55.244725 6128 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:58:55.244735 6128 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 1.845548ms\\\\nI1201 13:58:55.244747 6128 services_controller.go:356] Processing sync for service openshift-console-operator/metrics for network=default\\\\nF1201 13:58:55.244780 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.955986 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.966016 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.967687 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.967752 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.967766 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.967782 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.967796 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:14Z","lastTransitionTime":"2025-12-01T13:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.979204 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:14 crc kubenswrapper[4585]: I1201 13:59:14.993718 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:13Z\\\",\\\"message\\\":\\\"2025-12-01T13:58:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c\\\\n2025-12-01T13:58:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c to /host/opt/cni/bin/\\\\n2025-12-01T13:58:28Z [verbose] multus-daemon started\\\\n2025-12-01T13:58:28Z [verbose] Readiness Indicator file check\\\\n2025-12-01T13:59:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:59:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:14Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.004766 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:15Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.025565 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:15Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.040651 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:15Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.058246 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"979efa67-6804-42c6-9661-43397d760d30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43ecbe420cb6fc8702eae53c7d3a369f12a986685824f1f08c15398e47985cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb6a609f35d4968e1f0a001362568ddcc21b80f897ad76b3b7f9e7f4b1651af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e64b25d69a7d86297a1761d8a2c7e62e508e69a19ff6d48679efee71724b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:15Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.070181 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.070220 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.070232 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.070252 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.070265 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:15Z","lastTransitionTime":"2025-12-01T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.075211 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:15Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.088025 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:15Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.102289 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:15Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.117404 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:15Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.171961 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.172008 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.172016 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.172029 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.172038 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:15Z","lastTransitionTime":"2025-12-01T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.324043 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.324091 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.324104 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.324300 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.324311 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:15Z","lastTransitionTime":"2025-12-01T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.411412 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:15 crc kubenswrapper[4585]: E1201 13:59:15.411555 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.426544 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.426601 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.426610 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.426628 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.426636 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:15Z","lastTransitionTime":"2025-12-01T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.529517 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.529547 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.529556 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.529569 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.529578 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:15Z","lastTransitionTime":"2025-12-01T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.631788 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.631822 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.631832 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.631846 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.631858 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:15Z","lastTransitionTime":"2025-12-01T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.734193 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.734218 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.734226 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.734239 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.734248 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:15Z","lastTransitionTime":"2025-12-01T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.837246 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.837282 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.837292 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.837309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.837321 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:15Z","lastTransitionTime":"2025-12-01T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.939685 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.939719 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.939727 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.939742 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:15 crc kubenswrapper[4585]: I1201 13:59:15.939751 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:15Z","lastTransitionTime":"2025-12-01T13:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.041381 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.041423 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.041435 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.041450 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.041459 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:16Z","lastTransitionTime":"2025-12-01T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.144096 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.144137 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.144149 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.144166 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.144178 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:16Z","lastTransitionTime":"2025-12-01T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.247298 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.247361 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.247382 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.247412 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.247432 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:16Z","lastTransitionTime":"2025-12-01T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.350880 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.350937 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.350950 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.350991 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.351009 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:16Z","lastTransitionTime":"2025-12-01T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.412220 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.412286 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.412324 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:16 crc kubenswrapper[4585]: E1201 13:59:16.412467 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:16 crc kubenswrapper[4585]: E1201 13:59:16.412540 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:16 crc kubenswrapper[4585]: E1201 13:59:16.413016 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.434673 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.448044 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.453183 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.453265 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.453280 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.453299 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.453313 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:16Z","lastTransitionTime":"2025-12-01T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.460699 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"979efa67-6804-42c6-9661-43397d760d30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43ecbe420cb6fc8702eae53c7d3a369f12a986685824f1f08c15398e47985cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb6a609f35d4968e1f0a001362568ddcc21b80f897ad76b3b7f9e7f4b1651af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e64b25d69a7d86297a1761d8a2c7e62e508e69a19ff6d48679efee71724b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.474378 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.485364 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc 
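
Every status patch in this stretch of the journal is rejected by the pod.network-node-identity.openshift.io webhook for the same reason: its serving certificate expired on 2025-08-24T17:21:41Z, months before the node's current clock of 2025-12-01T13:59:16Z. The following standalone Go sketch (mine, not kubelet or CRC tooling) reproduces the NotBefore/NotAfter comparison behind the "x509: certificate has expired or is not yet valid" message; the /etc/webhook-cert/ directory matches the webhook container's mount in the entry above, but the tls.crt file name inside it is an assumption.

package main

// certcheck.go: a standalone sketch (not part of kubelet or CRC tooling) that reports
// whether a PEM-encoded serving certificate is valid for the current wall-clock time,
// mirroring the NotBefore/NotAfter check behind the x509 errors in this journal.
import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	// Assumed path: the webhook-cert volume is mounted at /etc/webhook-cert/ per the
	// pod status above; tls.crt is the conventional (assumed) file name for the cert.
	pemBytes, err := os.ReadFile("/etc/webhook-cert/tls.crt")
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil || block.Type != "CERTIFICATE" {
		log.Fatal("no CERTIFICATE block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now().UTC()
	fmt.Printf("notBefore=%s notAfter=%s\n",
		cert.NotBefore.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	switch {
	case now.Before(cert.NotBefore):
		fmt.Println("certificate is not yet valid")
	case now.After(cert.NotAfter):
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	default:
		fmt.Println("certificate is currently valid")
	}
}

With the dates recorded in this log, the expired branch would print the same "is after 2025-08-24T17:21:41Z" comparison that each failed webhook call reports.
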
kubenswrapper[4585]: I1201 13:59:16.496358 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.512041 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.523583 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.534619 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e6fbe56-8f79-4138-b082-5fe0f2198590\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4cd6715a83e1d1fc7d714956406fd0260653d09700c83f129dc4399bb089228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.549091 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.555565 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.555606 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.555621 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.555638 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.555651 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:16Z","lastTransitionTime":"2025-12-01T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.559537 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.578172 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
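
The NodeNotReady condition above follows directly from the message the kubelet gives: no CNI network configuration exists under /etc/kubernetes/cni/net.d/, most likely because the network plugin (the ovnkube-controller container shown crash-looping further down) has not written one yet. The sketch below is illustrative, not the kubelet's actual readiness check; the directory comes from the log message, while the .conf/.conflist/.json extension filter is an assumption about what counts as a network config.

package main

// cnicheck.go: an illustrative sketch (not kubelet code) that looks for CNI network
// configuration files in the directory named by the NodeNotReady message above.
import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d/" // taken from the "no CNI configuration file" message
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("cannot read %s: %v", dir, err)
	}
	found := 0
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println(filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration files found; NetworkReady stays false until the network plugin writes one")
	}
}
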
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.594370 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
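
Each err= field in these entries embeds the attempted JSON patch as a string that has been quoted twice: once when the kubelet formatted the patch into its error message, and once more when that whole message was written to the log. A small helper sketch for reading them follows; it is not part of any OpenShift tooling, and it assumes the payload is pasted as the single-line span from \"{ to }\" exactly as it appears in the journal, with both layers of escaping following Go's %q quoting (which is what klog uses for structured values).

package main

// decodepatch.go: a helper sketch for reading this journal. Stdin carries the patch
// span copied verbatim as one line; the output is the patch as indented JSON.
import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"log"
	"os"
	"strconv"
	"strings"
)

func main() {
	data, err := io.ReadAll(os.Stdin)
	if err != nil {
		log.Fatal(err)
	}
	raw := strings.TrimSpace(string(data))

	// First pass: undo the quoting applied when the whole err value was written to the log.
	once, err := strconv.Unquote(`"` + raw + `"`)
	if err != nil {
		log.Fatalf("outer unquote: %v", err)
	}
	// Second pass: the patch itself is embedded as a quoted string inside that message.
	patch, err := strconv.Unquote(once)
	if err != nil {
		log.Fatalf("inner unquote: %v", err)
	}
	var pretty bytes.Buffer
	if err := json.Indent(&pretty, []byte(patch), "", "  "); err != nil {
		log.Fatalf("not valid JSON after unescaping: %v", err)
	}
	fmt.Println(pretty.String())
}

Feeding it the span from the entry that continues below should yield the plain {"metadata":...,"status":...} document the kubelet was trying to apply, which is far easier to inspect than the escaped form.
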
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.607329 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.626612 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:55Z\\\",\\\"message\\\":\\\"Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:55.244725 6128 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:58:55.244735 6128 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 1.845548ms\\\\nI1201 13:58:55.244747 6128 services_controller.go:356] Processing sync for service openshift-console-operator/metrics for network=default\\\\nF1201 13:58:55.244780 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.641137 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.651859 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.661489 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.661520 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.661529 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.661543 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.661550 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:16Z","lastTransitionTime":"2025-12-01T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.664233 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.677723 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:13Z\\\",\\\"message\\\":\\\"2025-12-01T13:58:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c\\\\n2025-12-01T13:58:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c to /host/opt/cni/bin/\\\\n2025-12-01T13:58:28Z [verbose] multus-daemon started\\\\n2025-12-01T13:58:28Z [verbose] Readiness Indicator file check\\\\n2025-12-01T13:59:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:59:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:16Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.768897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.768930 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.768937 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.768950 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.768959 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:16Z","lastTransitionTime":"2025-12-01T13:59:16Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.870398 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.870447 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.870459 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.870476 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.870487 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:16Z","lastTransitionTime":"2025-12-01T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.972862 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.973161 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.973176 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.973254 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:16 crc kubenswrapper[4585]: I1201 13:59:16.973272 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:16Z","lastTransitionTime":"2025-12-01T13:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.075931 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.075965 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.075991 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.076006 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.076015 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:17Z","lastTransitionTime":"2025-12-01T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.178034 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.178109 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.178120 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.178136 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.178146 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:17Z","lastTransitionTime":"2025-12-01T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.280665 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.280715 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.280731 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.280748 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.280760 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:17Z","lastTransitionTime":"2025-12-01T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.383285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.383348 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.383362 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.383380 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.383392 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:17Z","lastTransitionTime":"2025-12-01T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.412581 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:17 crc kubenswrapper[4585]: E1201 13:59:17.412743 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.485189 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.485235 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.485245 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.485260 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.485269 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:17Z","lastTransitionTime":"2025-12-01T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.587785 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.587817 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.587825 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.587838 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.587848 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:17Z","lastTransitionTime":"2025-12-01T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.689691 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.689722 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.689730 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.689743 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.689752 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:17Z","lastTransitionTime":"2025-12-01T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.791464 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.791513 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.791524 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.791543 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.791552 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:17Z","lastTransitionTime":"2025-12-01T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.893610 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.893665 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.893675 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.893712 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.893722 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:17Z","lastTransitionTime":"2025-12-01T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.996066 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.996106 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.996117 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.996131 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:17 crc kubenswrapper[4585]: I1201 13:59:17.996141 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:17Z","lastTransitionTime":"2025-12-01T13:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.098111 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.098157 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.098167 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.098184 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.098195 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:18Z","lastTransitionTime":"2025-12-01T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.200043 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.200083 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.200094 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.200110 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.200119 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:18Z","lastTransitionTime":"2025-12-01T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.302598 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.302639 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.302649 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.302663 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.302672 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:18Z","lastTransitionTime":"2025-12-01T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.407834 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.407874 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.407883 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.407899 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.407909 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:18Z","lastTransitionTime":"2025-12-01T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.412470 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.412578 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.412469 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:18 crc kubenswrapper[4585]: E1201 13:59:18.412692 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:18 crc kubenswrapper[4585]: E1201 13:59:18.412614 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:18 crc kubenswrapper[4585]: E1201 13:59:18.412813 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.510139 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.510189 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.510199 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.510213 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.510223 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:18Z","lastTransitionTime":"2025-12-01T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.614338 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.614620 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.614708 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.614802 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.614884 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:18Z","lastTransitionTime":"2025-12-01T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.717218 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.717249 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.717257 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.717270 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.717278 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:18Z","lastTransitionTime":"2025-12-01T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.819883 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.819921 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.819930 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.819944 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.819953 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:18Z","lastTransitionTime":"2025-12-01T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.921618 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.921664 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.921676 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.921693 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:18 crc kubenswrapper[4585]: I1201 13:59:18.921705 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:18Z","lastTransitionTime":"2025-12-01T13:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.024248 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.024301 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.024313 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.024329 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.024339 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:19Z","lastTransitionTime":"2025-12-01T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.126888 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.126918 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.126928 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.126941 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.126949 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:19Z","lastTransitionTime":"2025-12-01T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.229115 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.229163 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.229174 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.229190 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.229200 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:19Z","lastTransitionTime":"2025-12-01T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.332445 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.332485 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.332494 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.332511 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.332520 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:19Z","lastTransitionTime":"2025-12-01T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.412430 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:19 crc kubenswrapper[4585]: E1201 13:59:19.412575 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.435310 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.435562 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.435675 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.435742 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.435804 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:19Z","lastTransitionTime":"2025-12-01T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.538480 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.538746 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.538826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.538915 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.538988 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:19Z","lastTransitionTime":"2025-12-01T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.640725 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.641099 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.641190 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.641255 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.641325 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:19Z","lastTransitionTime":"2025-12-01T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.743558 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.743842 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.743910 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.743989 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.744048 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:19Z","lastTransitionTime":"2025-12-01T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.846504 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.846545 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.846554 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.846569 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.846577 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:19Z","lastTransitionTime":"2025-12-01T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.948754 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.948836 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.948860 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.948892 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:19 crc kubenswrapper[4585]: I1201 13:59:19.948912 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:19Z","lastTransitionTime":"2025-12-01T13:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.050708 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.050747 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.050758 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.050774 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.050787 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.153653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.153876 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.154032 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.154131 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.154189 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.256442 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.256765 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.256875 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.256963 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.257080 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.360321 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.360363 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.360373 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.360392 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.360402 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.412159 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.412245 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:20 crc kubenswrapper[4585]: E1201 13:59:20.412293 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:20 crc kubenswrapper[4585]: E1201 13:59:20.412401 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.412449 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:20 crc kubenswrapper[4585]: E1201 13:59:20.412503 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.462778 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.462817 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.462826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.462841 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.462853 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.564966 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.565057 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.565076 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.565102 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.565123 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.667941 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.667987 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.667996 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.668009 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.668018 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.770100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.770143 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.770154 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.770168 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.770182 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.872090 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.872132 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.872141 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.872161 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.872172 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.934594 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.934628 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.934639 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.934653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.934663 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:20 crc kubenswrapper[4585]: E1201 13:59:20.952048 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:20Z is after 
2025-08-24T17:21:41Z" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.956045 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.956073 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.956081 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.956095 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.956105 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:20 crc kubenswrapper[4585]: E1201 13:59:20.971245 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:20Z is after 
2025-08-24T17:21:41Z" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.980027 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.980060 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.980126 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.980168 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.980184 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:20 crc kubenswrapper[4585]: E1201 13:59:20.993814 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:20Z is after 
2025-08-24T17:21:41Z" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.999203 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.999250 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.999260 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.999277 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:20 crc kubenswrapper[4585]: I1201 13:59:20.999287 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:20Z","lastTransitionTime":"2025-12-01T13:59:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:21 crc kubenswrapper[4585]: E1201 13:59:21.012603 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:21Z is after 
2025-08-24T17:21:41Z" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.016724 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.016789 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.016807 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.016831 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.016847 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:21Z","lastTransitionTime":"2025-12-01T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:21 crc kubenswrapper[4585]: E1201 13:59:21.029234 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:21Z is after 
2025-08-24T17:21:41Z" Dec 01 13:59:21 crc kubenswrapper[4585]: E1201 13:59:21.029468 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.031803 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.031880 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.031891 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.031910 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.031922 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:21Z","lastTransitionTime":"2025-12-01T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.134122 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.134161 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.134207 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.134223 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.134231 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:21Z","lastTransitionTime":"2025-12-01T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.237003 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.237068 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.237091 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.237119 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.237141 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:21Z","lastTransitionTime":"2025-12-01T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.339552 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.339580 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.339587 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.339599 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.339607 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:21Z","lastTransitionTime":"2025-12-01T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.411501 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:21 crc kubenswrapper[4585]: E1201 13:59:21.411700 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.413099 4585 scope.go:117] "RemoveContainer" containerID="b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.442429 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.442465 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.442479 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.442499 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.442514 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:21Z","lastTransitionTime":"2025-12-01T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.545155 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.545591 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.545610 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.545675 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.545691 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:21Z","lastTransitionTime":"2025-12-01T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.648306 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.648338 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.648363 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.648377 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.648385 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:21Z","lastTransitionTime":"2025-12-01T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.750895 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.751233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.751314 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.751396 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.751517 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:21Z","lastTransitionTime":"2025-12-01T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.854162 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.854204 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.854216 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.854232 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.854243 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:21Z","lastTransitionTime":"2025-12-01T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.881353 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/2.log" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.884305 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd"} Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.885103 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.898307 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e6fbe56-8f79-4138-b082-5fe0f2198590\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4cd6715a83e1d1fc7d714956406fd0260653d09700c83f129dc4399bb089228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:21Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.913028 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:21Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.924154 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:21Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.937873 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:21Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.952335 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:21Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.956211 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.956237 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.956245 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.956257 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.956267 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:21Z","lastTransitionTime":"2025-12-01T13:59:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.967999 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:21Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:21 crc kubenswrapper[4585]: I1201 13:59:21.992106 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:55Z\\\",\\\"message\\\":\\\"Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:55.244725 6128 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:58:55.244735 6128 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 1.845548ms\\\\nI1201 13:58:55.244747 6128 services_controller.go:356] Processing sync for service openshift-console-operator/metrics for network=default\\\\nF1201 13:58:55.244780 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1fcc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:59:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:21Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.003715 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.015860 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 
13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.030670 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.053729 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:13Z\\\",\\\"message\\\":\\\"2025-12-01T13:58:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c\\\\n2025-12-01T13:58:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c to /host/opt/cni/bin/\\\\n2025-12-01T13:58:28Z [verbose] multus-daemon started\\\\n2025-12-01T13:58:28Z [verbose] Readiness Indicator file check\\\\n2025-12-01T13:59:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:59:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.066365 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.066405 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.066417 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.066433 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.066454 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:22Z","lastTransitionTime":"2025-12-01T13:59:22Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.071799 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.088328 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.108471 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.122108 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.141227 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"979efa67-6804-42c6-9661-43397d760d30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43ecbe420cb6fc8702eae53c7d3a369f12a986685824f1f08c15398e47985cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb6a609f35d4968e1f0a001362568ddcc21b80f897ad76b3b7f9e7f4b1651af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e64b25d69a7d86297a1761d8a2c7e62e508e69a19ff6d48679efee71724b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.152882 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.163429 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.168180 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.168221 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.168233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.168276 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.168289 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:22Z","lastTransitionTime":"2025-12-01T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.178598 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.271991 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.272031 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.272041 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.272056 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.272065 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:22Z","lastTransitionTime":"2025-12-01T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.374220 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.374274 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.374282 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.374297 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.374305 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:22Z","lastTransitionTime":"2025-12-01T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.412107 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.412149 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.412175 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 13:59:22 crc kubenswrapper[4585]: E1201 13:59:22.412251 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 13:59:22 crc kubenswrapper[4585]: E1201 13:59:22.412352 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 13:59:22 crc kubenswrapper[4585]: E1201 13:59:22.412430 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.500647 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.500677 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.500687 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.500700 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.500709 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:22Z","lastTransitionTime":"2025-12-01T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.602628 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.602666 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.602677 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.602693 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.602705 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:22Z","lastTransitionTime":"2025-12-01T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.705787 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.705821 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.705830 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.705863 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.705872 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:22Z","lastTransitionTime":"2025-12-01T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.808184 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.808252 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.808266 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.808285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.808299 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:22Z","lastTransitionTime":"2025-12-01T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.889189 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/3.log"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.889930 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/2.log"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.891911 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd" exitCode=1
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.891961 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd"}
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.892026 4585 scope.go:117] "RemoveContainer" containerID="b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.892754 4585 scope.go:117] "RemoveContainer" containerID="6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd"
Dec 01 13:59:22 crc kubenswrapper[4585]: E1201 13:59:22.892951 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.910430 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.910457 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.910470 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.910504 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.910515 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:22Z","lastTransitionTime":"2025-12-01T13:59:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.911385 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.921692 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.940277 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e182b8e1ae7a92cc190a93197ab3bdc03ffc5cc65257651261acd04689f8ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:58:55Z\\\",\\\"message\\\":\\\"Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-image-registry/image-registry]} name:Service_openshift-image-registry/image-registry_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.93:5000:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {83c1e277-3d22-42ae-a355-f7a0ff0bd171}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 13:58:55.244725 6128 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:58:55.244735 6128 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 1.845548ms\\\\nI1201 13:58:55.244747 6128 services_controller.go:356] Processing sync for service openshift-console-operator/metrics for network=default\\\\nF1201 13:58:55.244780 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1fcc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:22Z\\\",\\\"message\\\":\\\"pping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:59:22.613221 6475 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:59:22.613256 6475 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 9.785386ms\\\\nI1201 13:59:22.613263 6475 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-qrdw5\\\\nI1201 13:59:22.613279 6475 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/packageserver-service for network=default\\\\nF1201 13:59:22.613282 6475 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:59:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.954034 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.968948 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.982544 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:22 crc kubenswrapper[4585]: I1201 13:59:22.996469 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:13Z\\\",\\\"message\\\":\\\"2025-12-01T13:58:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c\\\\n2025-12-01T13:58:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c to /host/opt/cni/bin/\\\\n2025-12-01T13:58:28Z [verbose] multus-daemon started\\\\n2025-12-01T13:58:28Z [verbose] Readiness Indicator file check\\\\n2025-12-01T13:59:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:59:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:22Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.010657 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.012259 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.012489 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.012585 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.012671 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.012753 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:23Z","lastTransitionTime":"2025-12-01T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.025356 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.037669 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"979efa67-6804-42c6-9661-43397d760d30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43ecbe420cb6fc8702eae53c7d3a369f12a986685824f1f08c15398e47985cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb6a609f35d4968e1f0a001362568ddcc21b80f897ad76b3b7f9e7f4b1651af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e64b25d69a7d86297a1761d8a2c7e62e508e69a19ff6d48679efee71724b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.050222 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.066876 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.083486 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.099634 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.111487 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.116257 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.116285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.116294 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.116308 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.116316 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:23Z","lastTransitionTime":"2025-12-01T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.131685 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f42df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.144618 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.155518 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.167131 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e6fbe56-8f79-4138-b082-5fe0f2198590\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4cd6715a83e1d1fc7d714956406fd0260653d09700c83f129dc4399bb089228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.219222 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.219273 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.219284 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.219303 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.219316 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:23Z","lastTransitionTime":"2025-12-01T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.322021 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.322072 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.322088 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.322108 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.322123 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:23Z","lastTransitionTime":"2025-12-01T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.411531 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:23 crc kubenswrapper[4585]: E1201 13:59:23.411721 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.423933 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.424047 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.424066 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.424083 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.424094 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:23Z","lastTransitionTime":"2025-12-01T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.526574 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.526879 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.526958 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.527060 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.527132 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:23Z","lastTransitionTime":"2025-12-01T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.629414 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.629454 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.629464 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.629479 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.629491 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:23Z","lastTransitionTime":"2025-12-01T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.732307 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.732345 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.732354 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.732371 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.732382 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:23Z","lastTransitionTime":"2025-12-01T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.834998 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.835033 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.835045 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.835062 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.835073 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:23Z","lastTransitionTime":"2025-12-01T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.897343 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/3.log" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.900836 4585 scope.go:117] "RemoveContainer" containerID="6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd" Dec 01 13:59:23 crc kubenswrapper[4585]: E1201 13:59:23.901019 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.914739 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e6fbe56-8f79-4138-b082-5fe0f2198590\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4cd6715a83e1d1fc7d714956406fd0260653d09700c83f129dc4399bb089228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"exitCode\\\":0,\
\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.931944 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.941201 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.941262 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.941274 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.941290 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.941302 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:23Z","lastTransitionTime":"2025-12-01T13:59:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.945732 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.965743 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:22Z\\\",\\\"message\\\":\\\"pping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:59:22.613221 6475 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:59:22.613256 6475 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 9.785386ms\\\\nI1201 13:59:22.613263 6475 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-qrdw5\\\\nI1201 13:59:22.613279 6475 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/packageserver-service for network=default\\\\nF1201 13:59:22.613282 6475 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:59:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.975865 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:23 crc kubenswrapper[4585]: I1201 13:59:23.989354 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:23Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.003289 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.016700 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.027245 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.038560 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.043871 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.043952 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.043986 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.044012 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.044024 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:24Z","lastTransitionTime":"2025-12-01T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.050025 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:13Z\\\",\\\"message\\\":\\\"2025-12-01T13:58:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c\\\\n2025-12-01T13:58:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c to /host/opt/cni/bin/\\\\n2025-12-01T13:58:28Z [verbose] multus-daemon started\\\\n2025-12-01T13:58:28Z [verbose] Readiness Indicator file check\\\\n2025-12-01T13:59:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:59:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.062918 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.073849 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.085352 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.098098 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.108105 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.130017 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.145952 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.149295 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.149320 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.149329 4585 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.149343 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.149352 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:24Z","lastTransitionTime":"2025-12-01T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.160021 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"979efa67-6804-42c6-9661-43397d760d30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43ecbe420cb6fc8702eae53c7d3a369f12a986685824f1f08c15398e47985cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb6a609f35d4968e1f0a001362568ddcc21b80f897ad76b3b7f9e7f4b1651af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e64b25d69a7d86297a1761d8a2c7e62e508e69a19ff6d48679efee71724b3d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:24Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.251996 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.252333 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.252419 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.252507 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.252587 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:24Z","lastTransitionTime":"2025-12-01T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.355816 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.356139 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.356233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.356319 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.356385 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:24Z","lastTransitionTime":"2025-12-01T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.411640 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.411734 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.411656 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:24 crc kubenswrapper[4585]: E1201 13:59:24.411856 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:24 crc kubenswrapper[4585]: E1201 13:59:24.411954 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:24 crc kubenswrapper[4585]: E1201 13:59:24.412034 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.458915 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.459204 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.459279 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.459385 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.459452 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:24Z","lastTransitionTime":"2025-12-01T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.562732 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.562787 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.562804 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.562826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.562844 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:24Z","lastTransitionTime":"2025-12-01T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.665786 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.665832 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.665843 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.665858 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.665870 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:24Z","lastTransitionTime":"2025-12-01T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.769627 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.769706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.769731 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.769798 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.769822 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:24Z","lastTransitionTime":"2025-12-01T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.874057 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.874100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.874113 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.874130 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.874144 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:24Z","lastTransitionTime":"2025-12-01T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.977437 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.977500 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.977517 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.977542 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:24 crc kubenswrapper[4585]: I1201 13:59:24.977559 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:24Z","lastTransitionTime":"2025-12-01T13:59:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.080857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.080901 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.080914 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.080931 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.080944 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:25Z","lastTransitionTime":"2025-12-01T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.184021 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.184081 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.184093 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.184108 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.184118 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:25Z","lastTransitionTime":"2025-12-01T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.286857 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.286905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.286920 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.286940 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.286954 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:25Z","lastTransitionTime":"2025-12-01T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.389648 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.389677 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.389685 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.389699 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.389707 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:25Z","lastTransitionTime":"2025-12-01T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.412107 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:25 crc kubenswrapper[4585]: E1201 13:59:25.412306 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.492354 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.492428 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.492442 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.492461 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.492472 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:25Z","lastTransitionTime":"2025-12-01T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.595454 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.595492 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.595503 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.595521 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.595535 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:25Z","lastTransitionTime":"2025-12-01T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.697686 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.697729 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.697742 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.697758 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.697851 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:25Z","lastTransitionTime":"2025-12-01T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.800628 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.800671 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.800683 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.800697 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.800706 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:25Z","lastTransitionTime":"2025-12-01T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.902797 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.902838 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.902855 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.902873 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:25 crc kubenswrapper[4585]: I1201 13:59:25.902883 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:25Z","lastTransitionTime":"2025-12-01T13:59:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.004630 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.004669 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.004682 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.004698 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.004709 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:26Z","lastTransitionTime":"2025-12-01T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.107069 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.107112 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.107124 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.107143 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.107157 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:26Z","lastTransitionTime":"2025-12-01T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.208994 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.209044 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.209065 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.209083 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.209093 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:26Z","lastTransitionTime":"2025-12-01T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.311151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.311227 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.311236 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.311250 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.311260 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:26Z","lastTransitionTime":"2025-12-01T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.411858 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:26 crc kubenswrapper[4585]: E1201 13:59:26.412107 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.412610 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:26 crc kubenswrapper[4585]: E1201 13:59:26.412720 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.413004 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:26 crc kubenswrapper[4585]: E1201 13:59:26.413057 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.413298 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.413322 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.413330 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.413342 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.413351 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:26Z","lastTransitionTime":"2025-12-01T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.428157 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\
\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc09
03fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.439238 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.456526 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.469148 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.483840 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"979efa67-6804-42c6-9661-43397d760d30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43ecbe420cb6fc8702eae53c7d3a369f12a986685824f1f08c15398e47985cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb6a609f35d4968e1f0a001362568ddcc21b80f897ad76b3b7f9e7f4b1651af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e64b25d69a7d86297a1761d8a2c7e62e508e69a19ff6d48679efee71724b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.497168 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.515162 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.515200 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.515211 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.515227 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.515238 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:26Z","lastTransitionTime":"2025-12-01T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.516079 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.528680 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.541068 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e6fbe56-8f79-4138-b082-5fe0f2198590\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4cd6715a83e1d1fc7d714956406fd0260653d09700c83f129dc4399bb089228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.556609 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.567919 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.581345 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.592751 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.605786 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.617261 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.617320 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.617335 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.617356 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.617368 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:26Z","lastTransitionTime":"2025-12-01T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.625394 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdc8fb68081a35662e2944a74d74746db096b94
4121e6261161172793f792fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:22Z\\\",\\\"message\\\":\\\"pping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:59:22.613221 6475 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:59:22.613256 6475 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 9.785386ms\\\\nI1201 13:59:22.613263 6475 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-qrdw5\\\\nI1201 13:59:22.613279 6475 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/packageserver-service for network=default\\\\nF1201 13:59:22.613282 6475 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:59:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.637183 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.649832 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.665664 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.679811 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:13Z\\\",\\\"message\\\":\\\"2025-12-01T13:58:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c\\\\n2025-12-01T13:58:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c to /host/opt/cni/bin/\\\\n2025-12-01T13:58:28Z [verbose] multus-daemon started\\\\n2025-12-01T13:58:28Z [verbose] Readiness Indicator file check\\\\n2025-12-01T13:59:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:59:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:26Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.719439 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.719478 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.719489 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.719504 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.719516 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:26Z","lastTransitionTime":"2025-12-01T13:59:26Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.821038 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.821097 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.821110 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.821126 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.821136 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:26Z","lastTransitionTime":"2025-12-01T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.923591 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.923628 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.923638 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.923653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:26 crc kubenswrapper[4585]: I1201 13:59:26.923666 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:26Z","lastTransitionTime":"2025-12-01T13:59:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.025467 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.025761 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.025944 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.026080 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.026182 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:27Z","lastTransitionTime":"2025-12-01T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.129219 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.129462 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.129533 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.129644 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.129710 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:27Z","lastTransitionTime":"2025-12-01T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.234124 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.234180 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.234193 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.234212 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.234223 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:27Z","lastTransitionTime":"2025-12-01T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.336431 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.336489 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.336501 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.336518 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.336530 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:27Z","lastTransitionTime":"2025-12-01T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.411618 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:27 crc kubenswrapper[4585]: E1201 13:59:27.411801 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.441399 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.441482 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.441496 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.441519 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.441535 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:27Z","lastTransitionTime":"2025-12-01T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.543865 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.543915 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.543963 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.544018 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.544033 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:27Z","lastTransitionTime":"2025-12-01T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.645622 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.645671 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.645685 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.645701 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.645714 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:27Z","lastTransitionTime":"2025-12-01T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.748026 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.748063 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.748076 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.748093 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.748104 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:27Z","lastTransitionTime":"2025-12-01T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.850296 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.850341 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.850353 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.850369 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.850381 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:27Z","lastTransitionTime":"2025-12-01T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.953346 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.953392 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.953407 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.953425 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:27 crc kubenswrapper[4585]: I1201 13:59:27.953437 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:27Z","lastTransitionTime":"2025-12-01T13:59:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.056282 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.056329 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.056337 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.056353 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.056363 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:28Z","lastTransitionTime":"2025-12-01T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.159485 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.159552 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.159575 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.159601 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.159615 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:28Z","lastTransitionTime":"2025-12-01T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.261363 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.261397 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.261407 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.261423 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.261432 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:28Z","lastTransitionTime":"2025-12-01T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.363094 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.363151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.363165 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.363181 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.363190 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:28Z","lastTransitionTime":"2025-12-01T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.412469 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.412526 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.412641 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:28 crc kubenswrapper[4585]: E1201 13:59:28.412761 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:28 crc kubenswrapper[4585]: E1201 13:59:28.412863 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:28 crc kubenswrapper[4585]: E1201 13:59:28.413013 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.466486 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.466529 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.466541 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.466560 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.466572 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:28Z","lastTransitionTime":"2025-12-01T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.569686 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.569752 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.569770 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.569796 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.569811 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:28Z","lastTransitionTime":"2025-12-01T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.672049 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.672082 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.672092 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.672139 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.672151 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:28Z","lastTransitionTime":"2025-12-01T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.775039 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.775072 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.775082 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.775097 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.775109 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:28Z","lastTransitionTime":"2025-12-01T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.878250 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.878284 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.878292 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.878307 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.878316 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:28Z","lastTransitionTime":"2025-12-01T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.981059 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.981096 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.981106 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.981121 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:28 crc kubenswrapper[4585]: I1201 13:59:28.981130 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:28Z","lastTransitionTime":"2025-12-01T13:59:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.083997 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.084022 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.084030 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.084045 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.084054 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:29Z","lastTransitionTime":"2025-12-01T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.187201 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.187291 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.187312 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.187344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.187366 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:29Z","lastTransitionTime":"2025-12-01T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.290392 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.290434 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.290451 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.290468 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.290478 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:29Z","lastTransitionTime":"2025-12-01T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.393689 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.393738 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.393749 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.393772 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.393789 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:29Z","lastTransitionTime":"2025-12-01T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.412065 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.412202 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.463220 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463337 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.463315461 +0000 UTC m=+147.447529316 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.463386 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.463451 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.463478 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.463508 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463512 4585 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463573 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.463562278 +0000 UTC m=+147.447776133 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463585 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463599 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463613 4585 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463639 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.46363149 +0000 UTC m=+147.447845345 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463707 4585 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463729 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463778 4585 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463806 4585 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463842 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.463804645 +0000 UTC m=+147.448018530 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 13:59:29 crc kubenswrapper[4585]: E1201 13:59:29.463876 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.463855837 +0000 UTC m=+147.448069722 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.497581 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.497652 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.497664 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.497686 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.497699 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:29Z","lastTransitionTime":"2025-12-01T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.599688 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.599724 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.599735 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.599750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.599762 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:29Z","lastTransitionTime":"2025-12-01T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.703380 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.703434 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.703445 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.703463 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.703475 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:29Z","lastTransitionTime":"2025-12-01T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.806559 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.806619 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.806631 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.806650 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.806660 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:29Z","lastTransitionTime":"2025-12-01T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.909395 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.909434 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.909442 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.909454 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:29 crc kubenswrapper[4585]: I1201 13:59:29.909465 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:29Z","lastTransitionTime":"2025-12-01T13:59:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.012434 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.012480 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.012491 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.012509 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.012524 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:30Z","lastTransitionTime":"2025-12-01T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.115609 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.115705 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.115736 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.115776 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.115827 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:30Z","lastTransitionTime":"2025-12-01T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.218336 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.218395 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.218407 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.218424 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.218435 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:30Z","lastTransitionTime":"2025-12-01T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.321768 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.321828 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.321860 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.321903 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.321928 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:30Z","lastTransitionTime":"2025-12-01T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.413322 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.413418 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.413604 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 13:59:30 crc kubenswrapper[4585]: E1201 13:59:30.413639 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 13:59:30 crc kubenswrapper[4585]: E1201 13:59:30.413753 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 13:59:30 crc kubenswrapper[4585]: E1201 13:59:30.413863 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.424121 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.424155 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.424167 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.424189 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.424201 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:30Z","lastTransitionTime":"2025-12-01T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.526760 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.526792 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.526800 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.526814 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.526823 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:30Z","lastTransitionTime":"2025-12-01T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.630607 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.630912 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.630923 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.630940 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.630952 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:30Z","lastTransitionTime":"2025-12-01T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.734315 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.734409 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.734423 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.734447 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.734466 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:30Z","lastTransitionTime":"2025-12-01T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.837542 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.837596 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.837607 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.837629 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.837641 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:30Z","lastTransitionTime":"2025-12-01T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.940282 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.940318 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.940329 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.940343 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:30 crc kubenswrapper[4585]: I1201 13:59:30.940353 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:30Z","lastTransitionTime":"2025-12-01T13:59:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.043427 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.043462 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.043473 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.043487 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.043496 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.145720 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.145751 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.145759 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.145772 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.145780 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.248151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.248204 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.248214 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.248232 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.248244 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.274542 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.274605 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.274618 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.274653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.274665 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: E1201 13:59:31.289426 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.295750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.295795 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.295810 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.295832 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.295842 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: E1201 13:59:31.312656 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.317099 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.317131 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.317140 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.317154 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.317163 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: E1201 13:59:31.332154 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.337120 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.337185 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.337203 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.337221 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.337231 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: E1201 13:59:31.355874 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.359582 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.359614 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.359624 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.359639 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.359648 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: E1201 13:59:31.372820 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:31Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:31 crc kubenswrapper[4585]: E1201 13:59:31.373009 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.374748 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.374814 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.374836 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.374863 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.374880 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.412182 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:31 crc kubenswrapper[4585]: E1201 13:59:31.412335 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.477890 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.477951 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.477963 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.478009 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.478023 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.580285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.580351 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.580364 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.580390 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.580405 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.683768 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.683810 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.683821 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.683838 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.683849 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.786649 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.786701 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.786711 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.786728 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.786740 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.889266 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.889337 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.889354 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.889377 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.889397 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.991740 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.991842 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.991866 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.991901 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:31 crc kubenswrapper[4585]: I1201 13:59:31.991921 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:31Z","lastTransitionTime":"2025-12-01T13:59:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.094005 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.094040 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.094047 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.094061 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.094078 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:32Z","lastTransitionTime":"2025-12-01T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.195878 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.195911 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.195922 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.195987 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.196001 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:32Z","lastTransitionTime":"2025-12-01T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.298156 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.298262 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.298293 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.298339 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.298366 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:32Z","lastTransitionTime":"2025-12-01T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.401365 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.401424 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.401436 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.401458 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.401473 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:32Z","lastTransitionTime":"2025-12-01T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.411758 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.411925 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.412119 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:32 crc kubenswrapper[4585]: E1201 13:59:32.412117 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:32 crc kubenswrapper[4585]: E1201 13:59:32.412250 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:32 crc kubenswrapper[4585]: E1201 13:59:32.412335 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.503463 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.503494 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.503502 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.503513 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.503522 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:32Z","lastTransitionTime":"2025-12-01T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.605478 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.605511 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.605522 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.605536 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.605546 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:32Z","lastTransitionTime":"2025-12-01T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.707651 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.707696 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.707706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.707722 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.707774 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:32Z","lastTransitionTime":"2025-12-01T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.809883 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.809917 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.809928 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.809943 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.809956 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:32Z","lastTransitionTime":"2025-12-01T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.912173 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.912252 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.912273 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.912301 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:32 crc kubenswrapper[4585]: I1201 13:59:32.912319 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:32Z","lastTransitionTime":"2025-12-01T13:59:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.015013 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.015059 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.015069 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.015085 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.015095 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:33Z","lastTransitionTime":"2025-12-01T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.118448 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.118509 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.118523 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.118540 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.118552 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:33Z","lastTransitionTime":"2025-12-01T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.221183 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.221216 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.221224 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.221243 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.221252 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:33Z","lastTransitionTime":"2025-12-01T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.324144 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.324198 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.324211 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.324234 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.324247 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:33Z","lastTransitionTime":"2025-12-01T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.412389 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:33 crc kubenswrapper[4585]: E1201 13:59:33.412531 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.425834 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.425884 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.425896 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.425916 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.425929 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:33Z","lastTransitionTime":"2025-12-01T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.528440 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.528574 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.528587 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.528601 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.528611 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:33Z","lastTransitionTime":"2025-12-01T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.632162 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.632222 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.632233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.632252 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.632264 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:33Z","lastTransitionTime":"2025-12-01T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.734805 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.734846 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.734858 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.734875 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.734890 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:33Z","lastTransitionTime":"2025-12-01T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.837758 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.837807 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.837823 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.837848 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.837865 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:33Z","lastTransitionTime":"2025-12-01T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.939636 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.939691 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.939716 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.939746 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:33 crc kubenswrapper[4585]: I1201 13:59:33.939789 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:33Z","lastTransitionTime":"2025-12-01T13:59:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.041807 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.041840 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.041850 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.041865 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.041876 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:34Z","lastTransitionTime":"2025-12-01T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.144645 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.144713 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.144738 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.144766 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.144790 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:34Z","lastTransitionTime":"2025-12-01T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.247205 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.247253 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.247269 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.247291 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.247308 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:34Z","lastTransitionTime":"2025-12-01T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.350255 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.350327 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.350349 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.350377 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.350400 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:34Z","lastTransitionTime":"2025-12-01T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.412076 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.412140 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.412076 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:34 crc kubenswrapper[4585]: E1201 13:59:34.412248 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:34 crc kubenswrapper[4585]: E1201 13:59:34.412381 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:34 crc kubenswrapper[4585]: E1201 13:59:34.412544 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.453105 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.453157 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.453184 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.453201 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.453211 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:34Z","lastTransitionTime":"2025-12-01T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.555620 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.555697 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.555716 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.555741 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.555758 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:34Z","lastTransitionTime":"2025-12-01T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.658245 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.658417 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.658462 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.658493 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.658515 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:34Z","lastTransitionTime":"2025-12-01T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.761542 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.761648 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.761660 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.761691 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.761703 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:34Z","lastTransitionTime":"2025-12-01T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.863668 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.863710 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.863720 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.863736 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.863746 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:34Z","lastTransitionTime":"2025-12-01T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.965909 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.965947 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.965955 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.965982 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:34 crc kubenswrapper[4585]: I1201 13:59:34.965992 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:34Z","lastTransitionTime":"2025-12-01T13:59:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.069469 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.069521 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.069534 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.069559 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.069571 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:35Z","lastTransitionTime":"2025-12-01T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.172380 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.172410 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.172418 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.172437 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.172447 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:35Z","lastTransitionTime":"2025-12-01T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.275079 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.275113 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.275123 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.275139 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.275150 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:35Z","lastTransitionTime":"2025-12-01T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.377918 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.377952 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.377960 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.377993 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.378003 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:35Z","lastTransitionTime":"2025-12-01T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.411534 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:35 crc kubenswrapper[4585]: E1201 13:59:35.411649 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.480398 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.480430 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.480439 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.480451 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.480459 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:35Z","lastTransitionTime":"2025-12-01T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.582213 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.582316 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.582331 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.582351 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.582361 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:35Z","lastTransitionTime":"2025-12-01T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.684668 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.684700 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.684710 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.684725 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.684736 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:35Z","lastTransitionTime":"2025-12-01T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.787037 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.787068 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.787078 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.787091 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.787101 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:35Z","lastTransitionTime":"2025-12-01T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.889504 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.889564 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.889574 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.889589 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.889598 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:35Z","lastTransitionTime":"2025-12-01T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.991966 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.992050 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.992063 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.992080 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:35 crc kubenswrapper[4585]: I1201 13:59:35.992092 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:35Z","lastTransitionTime":"2025-12-01T13:59:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.095185 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.095232 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.095241 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.095257 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.095269 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:36Z","lastTransitionTime":"2025-12-01T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.197487 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.197535 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.197546 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.197562 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.197573 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:36Z","lastTransitionTime":"2025-12-01T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.300092 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.300126 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.300133 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.300146 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.300155 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:36Z","lastTransitionTime":"2025-12-01T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.402606 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.402641 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.402651 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.402666 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.402677 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:36Z","lastTransitionTime":"2025-12-01T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.412094 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:36 crc kubenswrapper[4585]: E1201 13:59:36.412343 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.412434 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.412491 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:36 crc kubenswrapper[4585]: E1201 13:59:36.412631 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:36 crc kubenswrapper[4585]: E1201 13:59:36.412713 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.434569 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.446874 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.459652 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e6fbe56-8f79-4138-b082-5fe0f2198590\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4cd6715a83e1d1fc7d714956406fd0260653d09700c83f129dc4399bb089228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.476836 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.488782 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.512505 4585 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.512627 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.512648 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.512777 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.512851 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:36Z","lastTransitionTime":"2025-12-01T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.514690 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdc8fb68081a35662e2944a74d74746db096b94
4121e6261161172793f792fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:22Z\\\",\\\"message\\\":\\\"pping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:59:22.613221 6475 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:59:22.613256 6475 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 9.785386ms\\\\nI1201 13:59:22.613263 6475 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-qrdw5\\\\nI1201 13:59:22.613279 6475 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/packageserver-service for network=default\\\\nF1201 13:59:22.613282 6475 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:59:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.526257 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.539166 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.554797 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.568912 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:13Z\\\",\\\"message\\\":\\\"2025-12-01T13:58:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c\\\\n2025-12-01T13:58:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c to /host/opt/cni/bin/\\\\n2025-12-01T13:58:28Z [verbose] multus-daemon started\\\\n2025-12-01T13:58:28Z [verbose] Readiness Indicator file check\\\\n2025-12-01T13:59:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:59:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.581905 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.595077 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba
8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.608050 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"979efa67-6804-42c6-9661-43397d760d30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43ecbe420cb6fc8702eae53c7d3a369f12a986685824f1f08c15398e47985cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb6a609f35d4968e1f0a001362568ddcc21b80f897ad76b3b7f9e7f4b1651af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e64b25d69a7d86297a1761d8a2c7e62e508e69a19ff6d48679efee71724b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.615438 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.615484 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.615493 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.615508 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.615519 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:36Z","lastTransitionTime":"2025-12-01T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.623958 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.636425 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.649521 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.665926 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.676215 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.696563 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:36Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.718056 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.718126 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.718143 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.718167 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.718183 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:36Z","lastTransitionTime":"2025-12-01T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.821210 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.821257 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.821267 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.821285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.821298 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:36Z","lastTransitionTime":"2025-12-01T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.924112 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.924177 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.924195 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.924222 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:36 crc kubenswrapper[4585]: I1201 13:59:36.924240 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:36Z","lastTransitionTime":"2025-12-01T13:59:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.028339 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.028403 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.028414 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.028433 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.028446 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:37Z","lastTransitionTime":"2025-12-01T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.131438 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.131476 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.131487 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.131503 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.131516 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:37Z","lastTransitionTime":"2025-12-01T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.233897 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.233958 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.233993 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.234011 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.234022 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:37Z","lastTransitionTime":"2025-12-01T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.336486 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.336538 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.336546 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.336560 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.336570 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:37Z","lastTransitionTime":"2025-12-01T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.412028 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:37 crc kubenswrapper[4585]: E1201 13:59:37.412321 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.439581 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.439636 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.439647 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.439664 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.439674 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:37Z","lastTransitionTime":"2025-12-01T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.542070 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.542132 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.542158 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.542174 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.542185 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:37Z","lastTransitionTime":"2025-12-01T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.644181 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.644233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.644242 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.644260 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.644271 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:37Z","lastTransitionTime":"2025-12-01T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.747022 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.747074 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.747086 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.747107 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.747118 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:37Z","lastTransitionTime":"2025-12-01T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.849963 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.850014 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.850022 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.850038 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.850053 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:37Z","lastTransitionTime":"2025-12-01T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.952100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.952145 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.952156 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.952173 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:37 crc kubenswrapper[4585]: I1201 13:59:37.952184 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:37Z","lastTransitionTime":"2025-12-01T13:59:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.055552 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.055638 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.055658 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.055687 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.055708 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:38Z","lastTransitionTime":"2025-12-01T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.159179 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.159231 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.159243 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.159262 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.159274 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:38Z","lastTransitionTime":"2025-12-01T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.262146 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.262198 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.262211 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.262233 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.262249 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:38Z","lastTransitionTime":"2025-12-01T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.365411 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.365463 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.365479 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.365502 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.365517 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:38Z","lastTransitionTime":"2025-12-01T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.412138 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.412177 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.412252 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:38 crc kubenswrapper[4585]: E1201 13:59:38.412348 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:38 crc kubenswrapper[4585]: E1201 13:59:38.412813 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:38 crc kubenswrapper[4585]: E1201 13:59:38.413149 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.413168 4585 scope.go:117] "RemoveContainer" containerID="6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd" Dec 01 13:59:38 crc kubenswrapper[4585]: E1201 13:59:38.413393 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.468614 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.468645 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.468652 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.468666 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.468674 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:38Z","lastTransitionTime":"2025-12-01T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.571175 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.571213 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.571224 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.571240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.571251 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:38Z","lastTransitionTime":"2025-12-01T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.674075 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.674370 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.674458 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.674546 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.674627 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:38Z","lastTransitionTime":"2025-12-01T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.777010 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.777045 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.777054 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.777069 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.777078 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:38Z","lastTransitionTime":"2025-12-01T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.879826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.880155 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.880249 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.880355 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.880443 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:38Z","lastTransitionTime":"2025-12-01T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.982946 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.983355 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.983552 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.983759 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:38 crc kubenswrapper[4585]: I1201 13:59:38.983958 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:38Z","lastTransitionTime":"2025-12-01T13:59:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.087133 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.087180 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.087189 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.087203 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.087212 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:39Z","lastTransitionTime":"2025-12-01T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.189147 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.189199 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.189215 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.189237 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.189256 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:39Z","lastTransitionTime":"2025-12-01T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.291892 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.291948 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.291961 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.292015 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.292028 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:39Z","lastTransitionTime":"2025-12-01T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.394964 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.395069 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.395106 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.395138 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.395159 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:39Z","lastTransitionTime":"2025-12-01T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.411399 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:39 crc kubenswrapper[4585]: E1201 13:59:39.411775 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.497966 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.498058 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.498076 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.498099 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.498117 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:39Z","lastTransitionTime":"2025-12-01T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.600513 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.600547 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.600557 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.600572 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.600581 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:39Z","lastTransitionTime":"2025-12-01T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.703966 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.704072 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.704100 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.704139 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.704164 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:39Z","lastTransitionTime":"2025-12-01T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.808113 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.808204 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.808232 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.808268 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.808295 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:39Z","lastTransitionTime":"2025-12-01T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.911961 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.912057 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.912076 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.912103 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:39 crc kubenswrapper[4585]: I1201 13:59:39.912122 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:39Z","lastTransitionTime":"2025-12-01T13:59:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.015190 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.015316 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.015331 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.015354 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.015368 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:40Z","lastTransitionTime":"2025-12-01T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.118867 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.118938 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.118961 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.119036 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.119057 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:40Z","lastTransitionTime":"2025-12-01T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.222341 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.222379 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.222388 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.222402 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.222411 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:40Z","lastTransitionTime":"2025-12-01T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.326760 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.326856 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.326878 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.326920 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.326943 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:40Z","lastTransitionTime":"2025-12-01T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.412311 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:40 crc kubenswrapper[4585]: E1201 13:59:40.412576 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.413507 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.413534 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:40 crc kubenswrapper[4585]: E1201 13:59:40.414167 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:40 crc kubenswrapper[4585]: E1201 13:59:40.414272 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.430826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.430874 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.430885 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.430905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.430918 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:40Z","lastTransitionTime":"2025-12-01T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.536676 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.536740 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.536767 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.536808 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.536828 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:40Z","lastTransitionTime":"2025-12-01T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.643859 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.644767 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.644890 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.645025 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.645101 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:40Z","lastTransitionTime":"2025-12-01T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.748246 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.748839 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.748851 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.748872 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.748885 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:40Z","lastTransitionTime":"2025-12-01T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.852627 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.852792 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.852856 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.852934 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.853011 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:40Z","lastTransitionTime":"2025-12-01T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.955616 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.955652 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.955662 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.955675 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:40 crc kubenswrapper[4585]: I1201 13:59:40.955684 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:40Z","lastTransitionTime":"2025-12-01T13:59:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.058326 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.058622 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.058704 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.058798 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.058864 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.162084 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.162209 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.162230 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.162264 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.162291 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.263967 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.264292 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.264399 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.264486 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.264568 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.366392 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.366432 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.366441 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.366456 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.366465 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.412058 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:41 crc kubenswrapper[4585]: E1201 13:59:41.412263 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.469010 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.469043 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.469053 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.469065 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.469075 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.571019 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.571058 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.571067 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.571081 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.571090 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.673405 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.673446 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.673455 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.673472 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.673482 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.674730 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.674778 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.674790 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.674809 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.674822 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: E1201 13:59:41.687632 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.691852 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.691961 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.692085 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.692155 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.692217 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: E1201 13:59:41.704601 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.708892 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.708926 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.708936 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.708953 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.708963 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: E1201 13:59:41.723085 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.727308 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.727420 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.727438 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.727459 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.727476 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: E1201 13:59:41.743259 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.747535 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.747592 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.747606 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.747626 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.747639 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: E1201 13:59:41.762793 4585 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e38ee8f4-73f3-495c-b626-577353e9a008\\\",\\\"systemUUID\\\":\\\"fcf25ef7-53cd-4591-aa80-a73d07c13768\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:41Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:41 crc kubenswrapper[4585]: E1201 13:59:41.763186 4585 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.776019 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.776096 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.776115 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.776145 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.776165 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.878612 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.878653 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.878663 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.878679 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.878690 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.981104 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.981158 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.981175 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.981200 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:41 crc kubenswrapper[4585]: I1201 13:59:41.981218 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:41Z","lastTransitionTime":"2025-12-01T13:59:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.083887 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.083942 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.083952 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.083990 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.084005 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:42Z","lastTransitionTime":"2025-12-01T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.187715 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.187910 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.187933 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.187962 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.188019 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:42Z","lastTransitionTime":"2025-12-01T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.292015 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.292072 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.292087 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.292110 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.292125 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:42Z","lastTransitionTime":"2025-12-01T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.395137 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.395217 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.395240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.395280 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.395303 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:42Z","lastTransitionTime":"2025-12-01T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.412055 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.412117 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.412086 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:42 crc kubenswrapper[4585]: E1201 13:59:42.412293 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:42 crc kubenswrapper[4585]: E1201 13:59:42.412448 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:42 crc kubenswrapper[4585]: E1201 13:59:42.412568 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.498655 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.498711 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.498721 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.498740 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.498753 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:42Z","lastTransitionTime":"2025-12-01T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.602704 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.602792 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.602811 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.602838 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.602861 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:42Z","lastTransitionTime":"2025-12-01T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.705881 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.705918 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.705927 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.705941 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.705951 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:42Z","lastTransitionTime":"2025-12-01T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.808922 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.808993 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.809006 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.809068 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.809084 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:42Z","lastTransitionTime":"2025-12-01T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.911505 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.911565 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.911580 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.911597 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:42 crc kubenswrapper[4585]: I1201 13:59:42.911608 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:42Z","lastTransitionTime":"2025-12-01T13:59:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.013825 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.013865 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.013876 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.013893 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.013905 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:43Z","lastTransitionTime":"2025-12-01T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.116494 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.116546 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.116576 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.116595 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.116612 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:43Z","lastTransitionTime":"2025-12-01T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.219108 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.219166 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.219179 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.219198 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.219212 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:43Z","lastTransitionTime":"2025-12-01T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.321462 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.321507 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.321521 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.321537 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.321551 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:43Z","lastTransitionTime":"2025-12-01T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.412386 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:43 crc kubenswrapper[4585]: E1201 13:59:43.412576 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.424389 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.424434 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.424445 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.424461 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.424472 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:43Z","lastTransitionTime":"2025-12-01T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.528584 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.528624 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.528634 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.528660 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.528672 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:43Z","lastTransitionTime":"2025-12-01T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.632195 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.632270 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.632298 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.632328 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.632351 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:43Z","lastTransitionTime":"2025-12-01T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.736126 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.736188 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.736210 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.736231 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.736243 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:43Z","lastTransitionTime":"2025-12-01T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.839430 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.839506 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.839521 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.839552 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.839566 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:43Z","lastTransitionTime":"2025-12-01T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.941356 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.941404 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.941413 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.941429 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:43 crc kubenswrapper[4585]: I1201 13:59:43.941439 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:43Z","lastTransitionTime":"2025-12-01T13:59:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.043663 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.043759 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.043774 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.043792 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.043802 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:44Z","lastTransitionTime":"2025-12-01T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.146460 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.146528 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.146553 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.146595 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.146622 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:44Z","lastTransitionTime":"2025-12-01T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.250938 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.251060 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.251086 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.251116 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.251136 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:44Z","lastTransitionTime":"2025-12-01T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.354445 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.354541 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.354556 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.354586 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.354602 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:44Z","lastTransitionTime":"2025-12-01T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.412263 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.412334 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:44 crc kubenswrapper[4585]: E1201 13:59:44.412440 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.412510 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:44 crc kubenswrapper[4585]: E1201 13:59:44.412585 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:44 crc kubenswrapper[4585]: E1201 13:59:44.412639 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.458823 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.458905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.458930 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.459013 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.459041 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:44Z","lastTransitionTime":"2025-12-01T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.562925 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.563047 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.563075 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.563115 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.563143 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:44Z","lastTransitionTime":"2025-12-01T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.666323 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.666397 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.666419 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.666454 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.666477 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:44Z","lastTransitionTime":"2025-12-01T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.769154 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.769193 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.769202 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.769219 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.769234 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:44Z","lastTransitionTime":"2025-12-01T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.872663 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.872705 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.872718 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.872737 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.872751 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:44Z","lastTransitionTime":"2025-12-01T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.975443 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.975478 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.975489 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.975506 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:44 crc kubenswrapper[4585]: I1201 13:59:44.975517 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:44Z","lastTransitionTime":"2025-12-01T13:59:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.079238 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.079296 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.079309 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.079329 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.079342 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:45Z","lastTransitionTime":"2025-12-01T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.182116 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.182176 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.182187 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.182207 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.182218 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:45Z","lastTransitionTime":"2025-12-01T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.242314 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:45 crc kubenswrapper[4585]: E1201 13:59:45.242560 4585 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:59:45 crc kubenswrapper[4585]: E1201 13:59:45.242705 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs podName:f11a95e1-135a-4fd2-9a04-1487c56a18e1 nodeName:}" failed. No retries permitted until 2025-12-01 14:00:49.242672103 +0000 UTC m=+163.226885988 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs") pod "network-metrics-daemon-qrdw5" (UID: "f11a95e1-135a-4fd2-9a04-1487c56a18e1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.285621 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.285673 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.285688 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.285706 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.285719 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:45Z","lastTransitionTime":"2025-12-01T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.390033 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.390080 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.390092 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.390111 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.390122 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:45Z","lastTransitionTime":"2025-12-01T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.411408 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:45 crc kubenswrapper[4585]: E1201 13:59:45.411566 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.493019 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.493075 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.493089 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.493112 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.493127 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:45Z","lastTransitionTime":"2025-12-01T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.596541 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.596621 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.596640 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.596669 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.596685 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:45Z","lastTransitionTime":"2025-12-01T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.701284 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.701354 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.701373 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.701403 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.701427 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:45Z","lastTransitionTime":"2025-12-01T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.804884 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.804937 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.804950 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.804990 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.805004 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:45Z","lastTransitionTime":"2025-12-01T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.906941 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.907071 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.907095 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.907123 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:45 crc kubenswrapper[4585]: I1201 13:59:45.907148 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:45Z","lastTransitionTime":"2025-12-01T13:59:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.010678 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.010709 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.010721 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.010736 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.010746 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:46Z","lastTransitionTime":"2025-12-01T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.114340 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.114407 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.114433 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.114468 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.114493 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:46Z","lastTransitionTime":"2025-12-01T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.218298 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.218334 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.218347 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.218366 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.218377 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:46Z","lastTransitionTime":"2025-12-01T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.321111 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.321155 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.321165 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.321178 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.321188 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:46Z","lastTransitionTime":"2025-12-01T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.412248 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.412334 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:46 crc kubenswrapper[4585]: E1201 13:59:46.412462 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.412490 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:46 crc kubenswrapper[4585]: E1201 13:59:46.412597 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:46 crc kubenswrapper[4585]: E1201 13:59:46.413078 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.424202 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.424264 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.424276 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.424343 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.424357 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:46Z","lastTransitionTime":"2025-12-01T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.426362 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f11a95e1-135a-4fd2-9a04-1487c56a18e1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmr99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qrdw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.455809 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b62eed93-4d28-4748-b991-dd9692b58341\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e952d1592344330416348b9ec4add60740a0d23ab554afb70c6419977fb12533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e35deff79558e54b62aa868535d5bc4d4374ede161463dcf5f3a33584867447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9736ff6b49635cb7c26db7bc2292b36cf63053e0d1e4ffa2980a520dd71d9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fe7079ed6fa6b8b64d16c67287c33c2bffa0f4
2df2166fe66e3f8e173ca87d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83ba28d0b57f4744f3f7f97915fde983e2de227958ddaecd43b3eae9eb406774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ec257e361f31df343fb8a229150e5a0164e364188dcb291cbe3c70e94c5b805\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a9ddbf7a99f9e2b63d5727cb1304b4ac1910cf16e8e21709c84a729523d3fe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37b7fde29eb4f37ea09121b7f933467eacdba0d9a6017a1cd24018763a7e9967\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.476993 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4030376c-31f1-420a-935f-f1ee174914e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2571804bbf066bfec1f4fde3430800aa87d9c5b6db48cc1f3d352d742657ca8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f66a3bd7503966a7b443b7ea1adbd8554c99e810749bc275e5cfa4e95c6b5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4434211239bc61f432b46918bbb691c6257fded2fbd9552ce3344d10e091b3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.496070 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"979efa67-6804-42c6-9661-43397d760d30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43ecbe420cb6fc8702eae53c7d3a369f12a986685824f1f08c15398e47985cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb6a609f35d4968e1f0a001362568ddcc21b80f897ad76b3b7f9e7f4b1651af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2e64b25d69a7d86297a1761d8a2c7e62e508e69a19ff6d48679efee71724b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfc93b1f443fde238aa3a3b7a475749b55ddefd79b2380833c6288aa73b131db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.517414 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b3ea01760c8f1e58a55a59a0d453c25ca1298ede32d471f02092fd9d0a43830\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92e7aa15babbd745afed6bc0d4bbf5526180b8d32f86f8db736066fad3feb506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.527225 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.527285 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.527304 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.527334 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.527354 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:46Z","lastTransitionTime":"2025-12-01T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.538037 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4694a9b9e5ea09fb1d1963504d94853e22fce9688a502036324f92f39bf8a8d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.555510 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.574758 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0e1eaa6-bee9-401a-b01a-9bf49a938b29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db9b353d1934e3336c4282cb057c52ef64e01675fa36e23d1fa0af7133508a62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fc7580ccd7bf968ff54f661743324944c15b6fd1db9ce687738d53cf677cf8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f28bc0fcd255a12e4a40d3c39c53c3f6e078e3934d972286a7747ae98c6a54a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f3e0b6c5dac9ff9b4686937a7d1646c4796a3326193349cf1556876bf1ae61a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2451dad343dcc9ec3bd5eb7109aefab60da5b700ae69d8eff28fef524795a758\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b888c3602a8f5d0f8cb762ab7d9b6676a0467f9546dbc38c5d2dd99acedd0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3110c36a439f88ab7fc0903fd766db1abef51f129bb1fc68634531d0e30cab0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dgw4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xh4hc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.589675 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e6fbe56-8f79-4138-b082-5fe0f2198590\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4cd6715a83e1d1fc7d714956406fd0260653d09700c83f129dc4399bb089228\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4586348a7e68fe58af97d73625dda61628f3b2176e495c513e783f9c8f19d33\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.605157 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.616950 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-62bsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98be7526-98f6-4a4a-b4a6-1d10e76b7a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d4df68e891df5e4e44371a4c884475b5723d55f6a10d32889cf194ddbe6179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dxjs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-62bsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.630247 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.630589 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.630669 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.630761 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.630853 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:46Z","lastTransitionTime":"2025-12-01T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.636915 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b10cdae7-154c-4fd6-a308-02843603d7ff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753
fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T13:58:25Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 13:58:19.764445 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 13:58:19.765610 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3283210930/tls.crt::/tmp/serving-cert-3283210930/tls.key\\\\\\\"\\\\nI1201 13:58:25.461951 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 13:58:25.464641 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 13:58:25.464660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 13:58:25.464680 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 13:58:25.464686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 13:58:25.473932 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 13:58:25.473958 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 13:58:25.473980 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 13:58:25.473983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 13:58:25.473986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 13:58:25.473989 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 13:58:25.474166 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 13:58:25.477276 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.652265 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9e9e83cdb2aa3565484c1ff3e5bbd72c7f50a132de50e60aec1aa88ad145ca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.665581 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7beb40d-bcd0-43c8-a9fe-c32408790a4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59e8d51b7e0c8b526fab11f39a0a94685aecda8a1d300e1ee2259eb4bbac9003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zccd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lj9gs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.686678 4585 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:22Z\\\",\\\"message\\\":\\\"pping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1201 13:59:22.613221 6475 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}\\\\nI1201 13:59:22.613256 6475 services_controller.go:360] Finished syncing service etcd on namespace openshift-etcd for network=default : 9.785386ms\\\\nI1201 13:59:22.613263 6475 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-qrdw5\\\\nI1201 13:59:22.613279 6475 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/packageserver-service for network=default\\\\nF1201 13:59:22.613282 6475 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:59:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjhfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tjkqr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.700534 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nhp6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6d315c-d478-4216-9f3d-57b20ce5ced8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ebdaa73edbe04780c0f421ea88446b4268ed1792af30e1ebb945167db80a548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n25g8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nhp6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.716892 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84c45f35-33ae-4eba-97c7-eb85a6db85a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf9d61c1ae31a9cc21ea918e419af7e8badf234ef1614ef73bafdc95381935cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b27c5e93a6707bc6b2b6ef81596e39344c0b29f8cac30ac5d62c7eaa5e635f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:58:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rsfmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kn5d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.734344 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.734434 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.734461 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.734498 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.734525 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:46Z","lastTransitionTime":"2025-12-01T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.742303 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.759013 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9wjs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e7ad3ad-7937-409b-b1c9-9c801f937400\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:58:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T13:59:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T13:59:13Z\\\",\\\"message\\\":\\\"2025-12-01T13:58:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c\\\\n2025-12-01T13:58:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8defcfa5-6cc7-4501-b9cf-73be7747a85c to /host/opt/cni/bin/\\\\n2025-12-01T13:58:28Z [verbose] multus-daemon started\\\\n2025-12-01T13:58:28Z [verbose] Readiness Indicator file check\\\\n2025-12-01T13:59:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T13:58:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T13:59:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xvc7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T13:58:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9wjs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T13:59:46Z is after 2025-08-24T17:21:41Z" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.838405 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.838473 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.838489 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.838520 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.838539 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:46Z","lastTransitionTime":"2025-12-01T13:59:46Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.942429 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.942497 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.942509 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.942531 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:46 crc kubenswrapper[4585]: I1201 13:59:46.942544 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:46Z","lastTransitionTime":"2025-12-01T13:59:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.047364 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.047414 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.047425 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.047444 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.047454 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:47Z","lastTransitionTime":"2025-12-01T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.150456 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.150523 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.150534 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.150551 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.150564 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:47Z","lastTransitionTime":"2025-12-01T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.254214 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.254297 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.254323 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.254355 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.254376 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:47Z","lastTransitionTime":"2025-12-01T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.358432 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.358494 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.358511 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.358540 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.358558 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:47Z","lastTransitionTime":"2025-12-01T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.412516 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:47 crc kubenswrapper[4585]: E1201 13:59:47.413018 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.462135 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.462189 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.462202 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.462228 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.462249 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:47Z","lastTransitionTime":"2025-12-01T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.565549 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.565602 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.565613 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.565634 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.565645 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:47Z","lastTransitionTime":"2025-12-01T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.668910 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.668991 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.669002 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.669020 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.669030 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:47Z","lastTransitionTime":"2025-12-01T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.772437 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.772535 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.772554 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.772582 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.772600 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:47Z","lastTransitionTime":"2025-12-01T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.876166 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.876210 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.876222 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.876240 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.876251 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:47Z","lastTransitionTime":"2025-12-01T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.978680 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.978750 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.978765 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.978791 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:47 crc kubenswrapper[4585]: I1201 13:59:47.978808 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:47Z","lastTransitionTime":"2025-12-01T13:59:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.082858 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.082902 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.082913 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.082931 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.082945 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:48Z","lastTransitionTime":"2025-12-01T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.187134 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.187195 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.187212 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.187237 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.187255 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:48Z","lastTransitionTime":"2025-12-01T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.290151 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.290225 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.290248 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.290281 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.290304 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:48Z","lastTransitionTime":"2025-12-01T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.393229 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.393288 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.393304 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.393323 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.393333 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:48Z","lastTransitionTime":"2025-12-01T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.411479 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.411583 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:48 crc kubenswrapper[4585]: E1201 13:59:48.411660 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:48 crc kubenswrapper[4585]: E1201 13:59:48.411738 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.411479 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:48 crc kubenswrapper[4585]: E1201 13:59:48.411845 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.496172 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.496199 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.496207 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.496220 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.496229 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:48Z","lastTransitionTime":"2025-12-01T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.599286 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.599336 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.599349 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.599367 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.599379 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:48Z","lastTransitionTime":"2025-12-01T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.701784 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.701832 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.701851 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.701871 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.701885 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:48Z","lastTransitionTime":"2025-12-01T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.803747 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.803778 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.803786 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.803798 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.803807 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:48Z","lastTransitionTime":"2025-12-01T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.905997 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.906038 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.906048 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.906062 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:48 crc kubenswrapper[4585]: I1201 13:59:48.906074 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:48Z","lastTransitionTime":"2025-12-01T13:59:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.008673 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.008705 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.008717 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.008734 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.008745 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:49Z","lastTransitionTime":"2025-12-01T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.120909 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.121037 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.121080 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.121112 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.121137 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:49Z","lastTransitionTime":"2025-12-01T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.224927 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.225181 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.225391 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.225504 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.225587 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:49Z","lastTransitionTime":"2025-12-01T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.328710 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.328787 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.328809 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.328840 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.328861 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:49Z","lastTransitionTime":"2025-12-01T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.412188 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:49 crc kubenswrapper[4585]: E1201 13:59:49.412445 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.432658 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.432736 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.432763 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.432800 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.432822 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:49Z","lastTransitionTime":"2025-12-01T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.536927 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.537168 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.537193 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.537226 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.537247 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:49Z","lastTransitionTime":"2025-12-01T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.640601 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.640674 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.640702 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.640736 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.640758 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:49Z","lastTransitionTime":"2025-12-01T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.743810 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.744160 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.744247 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.744335 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.744558 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:49Z","lastTransitionTime":"2025-12-01T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.847449 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.847519 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.847539 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.847571 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.847590 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:49Z","lastTransitionTime":"2025-12-01T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.951085 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.951748 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.951831 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.951965 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:49 crc kubenswrapper[4585]: I1201 13:59:49.952084 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:49Z","lastTransitionTime":"2025-12-01T13:59:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.055369 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.055424 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.055437 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.055454 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.055466 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:50Z","lastTransitionTime":"2025-12-01T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.158385 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.158436 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.158447 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.158465 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.158476 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:50Z","lastTransitionTime":"2025-12-01T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.262016 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.262105 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.262131 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.262177 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.262201 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:50Z","lastTransitionTime":"2025-12-01T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.365952 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.366134 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.366158 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.366206 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.366234 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:50Z","lastTransitionTime":"2025-12-01T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.413247 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:50 crc kubenswrapper[4585]: E1201 13:59:50.413529 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.413921 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.414127 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:50 crc kubenswrapper[4585]: E1201 13:59:50.414344 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:50 crc kubenswrapper[4585]: E1201 13:59:50.414551 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.469638 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.469714 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.469738 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.469778 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.469803 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:50Z","lastTransitionTime":"2025-12-01T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.574216 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.574296 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.574321 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.574357 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.574381 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:50Z","lastTransitionTime":"2025-12-01T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.677763 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.677842 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.677862 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.677896 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.677917 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:50Z","lastTransitionTime":"2025-12-01T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.781823 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.781905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.781925 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.781959 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.782009 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:50Z","lastTransitionTime":"2025-12-01T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.885505 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.885556 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.885575 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.885601 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.885623 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:50Z","lastTransitionTime":"2025-12-01T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.989508 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.989582 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.989607 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.989648 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:50 crc kubenswrapper[4585]: I1201 13:59:50.989671 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:50Z","lastTransitionTime":"2025-12-01T13:59:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.091883 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.091945 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.091955 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.092008 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.092019 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:51Z","lastTransitionTime":"2025-12-01T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.195200 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.195275 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.195296 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.195328 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.195352 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:51Z","lastTransitionTime":"2025-12-01T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.299140 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.299226 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.299257 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.299302 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.299323 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:51Z","lastTransitionTime":"2025-12-01T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.402598 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.402663 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.402684 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.402713 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.402730 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:51Z","lastTransitionTime":"2025-12-01T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.412128 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:51 crc kubenswrapper[4585]: E1201 13:59:51.412347 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.505646 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.505721 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.505745 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.505779 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.505805 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:51Z","lastTransitionTime":"2025-12-01T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.609711 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.609766 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.609780 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.609805 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.609821 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:51Z","lastTransitionTime":"2025-12-01T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.712959 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.713035 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.713047 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.713068 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.713092 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:51Z","lastTransitionTime":"2025-12-01T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.816276 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.816331 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.816341 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.816357 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.816367 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:51Z","lastTransitionTime":"2025-12-01T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.919441 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.919503 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.919522 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.919551 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:51 crc kubenswrapper[4585]: I1201 13:59:51.919571 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:51Z","lastTransitionTime":"2025-12-01T13:59:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.030825 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.030905 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.030925 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.031261 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.031370 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:52Z","lastTransitionTime":"2025-12-01T13:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.088737 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.088807 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.088826 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.088855 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.088878 4585 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T13:59:52Z","lastTransitionTime":"2025-12-01T13:59:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.164659 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn"] Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.165063 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.169003 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.170296 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.170304 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.170466 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.208905 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=39.208874456 podStartE2EDuration="39.208874456s" podCreationTimestamp="2025-12-01 13:59:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:59:52.185541206 +0000 UTC m=+106.169755071" watchObservedRunningTime="2025-12-01 13:59:52.208874456 +0000 UTC m=+106.193088311" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.248486 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-62bsn" podStartSLOduration=86.24845699 podStartE2EDuration="1m26.24845699s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:59:52.226554712 +0000 UTC m=+106.210768567" watchObservedRunningTime="2025-12-01 13:59:52.24845699 +0000 UTC m=+106.232670845" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.269830 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=87.269804403 podStartE2EDuration="1m27.269804403s" podCreationTimestamp="2025-12-01 13:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:59:52.249396778 +0000 UTC m=+106.233610643" watchObservedRunningTime="2025-12-01 13:59:52.269804403 +0000 UTC m=+106.254018258" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.336056 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/226e5dc4-b72d-45b0-93e7-467871b8554f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.336109 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/226e5dc4-b72d-45b0-93e7-467871b8554f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 
13:59:52.336133 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/226e5dc4-b72d-45b0-93e7-467871b8554f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.336165 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/226e5dc4-b72d-45b0-93e7-467871b8554f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.336209 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/226e5dc4-b72d-45b0-93e7-467871b8554f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.345457 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podStartSLOduration=86.345429417 podStartE2EDuration="1m26.345429417s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:59:52.304338959 +0000 UTC m=+106.288552824" watchObservedRunningTime="2025-12-01 13:59:52.345429417 +0000 UTC m=+106.329643272" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.365369 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nhp6c" podStartSLOduration=86.365322097 podStartE2EDuration="1m26.365322097s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:59:52.364627737 +0000 UTC m=+106.348841592" watchObservedRunningTime="2025-12-01 13:59:52.365322097 +0000 UTC m=+106.349535952" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.397772 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kn5d" podStartSLOduration=85.397554417 podStartE2EDuration="1m25.397554417s" podCreationTimestamp="2025-12-01 13:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:59:52.378960245 +0000 UTC m=+106.363174100" watchObservedRunningTime="2025-12-01 13:59:52.397554417 +0000 UTC m=+106.381768272" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.412268 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.412330 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.412348 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:52 crc kubenswrapper[4585]: E1201 13:59:52.412783 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:52 crc kubenswrapper[4585]: E1201 13:59:52.412959 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:52 crc kubenswrapper[4585]: E1201 13:59:52.413050 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.413117 4585 scope.go:117] "RemoveContainer" containerID="6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd" Dec 01 13:59:52 crc kubenswrapper[4585]: E1201 13:59:52.413307 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-tjkqr_openshift-ovn-kubernetes(b0b45150-070d-4f7c-b53a-d76dcbaa6e6d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.417585 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9wjs5" podStartSLOduration=86.417570181 podStartE2EDuration="1m26.417570181s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:59:52.417248671 +0000 UTC m=+106.401462526" watchObservedRunningTime="2025-12-01 13:59:52.417570181 +0000 UTC m=+106.401784036" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.437920 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/226e5dc4-b72d-45b0-93e7-467871b8554f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.438059 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/226e5dc4-b72d-45b0-93e7-467871b8554f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.438106 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/226e5dc4-b72d-45b0-93e7-467871b8554f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.438128 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/226e5dc4-b72d-45b0-93e7-467871b8554f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.438322 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/226e5dc4-b72d-45b0-93e7-467871b8554f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.438419 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/226e5dc4-b72d-45b0-93e7-467871b8554f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.438917 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/226e5dc4-b72d-45b0-93e7-467871b8554f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.439502 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/226e5dc4-b72d-45b0-93e7-467871b8554f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.446042 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/226e5dc4-b72d-45b0-93e7-467871b8554f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: \"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.463449 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/226e5dc4-b72d-45b0-93e7-467871b8554f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c65rn\" (UID: 
\"226e5dc4-b72d-45b0-93e7-467871b8554f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.465642 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=84.465615941 podStartE2EDuration="1m24.465615941s" podCreationTimestamp="2025-12-01 13:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:59:52.465059835 +0000 UTC m=+106.449273690" watchObservedRunningTime="2025-12-01 13:59:52.465615941 +0000 UTC m=+106.449829796" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.466116 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=83.466111106 podStartE2EDuration="1m23.466111106s" podCreationTimestamp="2025-12-01 13:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:59:52.448984097 +0000 UTC m=+106.433197952" watchObservedRunningTime="2025-12-01 13:59:52.466111106 +0000 UTC m=+106.450324961" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.489717 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.502002 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=52.501960031 podStartE2EDuration="52.501960031s" podCreationTimestamp="2025-12-01 13:59:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:59:52.479165866 +0000 UTC m=+106.463379731" watchObservedRunningTime="2025-12-01 13:59:52.501960031 +0000 UTC m=+106.486173886" Dec 01 13:59:52 crc kubenswrapper[4585]: I1201 13:59:52.571777 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xh4hc" podStartSLOduration=86.571745195 podStartE2EDuration="1m26.571745195s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:59:52.569810029 +0000 UTC m=+106.554023884" watchObservedRunningTime="2025-12-01 13:59:52.571745195 +0000 UTC m=+106.555959050" Dec 01 13:59:53 crc kubenswrapper[4585]: I1201 13:59:53.009071 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" event={"ID":"226e5dc4-b72d-45b0-93e7-467871b8554f","Type":"ContainerStarted","Data":"329396bf27cd50af344b74dab68e7f98fe1cbab28d9823f8369948930d445809"} Dec 01 13:59:53 crc kubenswrapper[4585]: I1201 13:59:53.009732 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" event={"ID":"226e5dc4-b72d-45b0-93e7-467871b8554f","Type":"ContainerStarted","Data":"e931968058360afebec6a97e68981ebaafd694dc197497efad0e486592da6c94"} Dec 01 13:59:53 crc kubenswrapper[4585]: I1201 13:59:53.043025 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c65rn" 
podStartSLOduration=87.042991935 podStartE2EDuration="1m27.042991935s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 13:59:53.042550672 +0000 UTC m=+107.026764587" watchObservedRunningTime="2025-12-01 13:59:53.042991935 +0000 UTC m=+107.027205800" Dec 01 13:59:53 crc kubenswrapper[4585]: I1201 13:59:53.412181 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:53 crc kubenswrapper[4585]: E1201 13:59:53.412363 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:54 crc kubenswrapper[4585]: I1201 13:59:54.411626 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:54 crc kubenswrapper[4585]: I1201 13:59:54.411865 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:54 crc kubenswrapper[4585]: E1201 13:59:54.412210 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:54 crc kubenswrapper[4585]: I1201 13:59:54.412345 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:54 crc kubenswrapper[4585]: E1201 13:59:54.412429 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:54 crc kubenswrapper[4585]: E1201 13:59:54.412483 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:55 crc kubenswrapper[4585]: I1201 13:59:55.411489 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:55 crc kubenswrapper[4585]: E1201 13:59:55.411818 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:56 crc kubenswrapper[4585]: I1201 13:59:56.412306 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:56 crc kubenswrapper[4585]: I1201 13:59:56.412353 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:56 crc kubenswrapper[4585]: I1201 13:59:56.412447 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:56 crc kubenswrapper[4585]: E1201 13:59:56.413546 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:56 crc kubenswrapper[4585]: E1201 13:59:56.413663 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:56 crc kubenswrapper[4585]: E1201 13:59:56.413837 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:57 crc kubenswrapper[4585]: I1201 13:59:57.411826 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:57 crc kubenswrapper[4585]: E1201 13:59:57.411962 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 13:59:58 crc kubenswrapper[4585]: I1201 13:59:58.411478 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 13:59:58 crc kubenswrapper[4585]: I1201 13:59:58.411596 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 13:59:58 crc kubenswrapper[4585]: I1201 13:59:58.411612 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 13:59:58 crc kubenswrapper[4585]: E1201 13:59:58.411825 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 13:59:58 crc kubenswrapper[4585]: E1201 13:59:58.411928 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 13:59:58 crc kubenswrapper[4585]: E1201 13:59:58.412127 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 13:59:59 crc kubenswrapper[4585]: I1201 13:59:59.411585 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 13:59:59 crc kubenswrapper[4585]: E1201 13:59:59.411711 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 14:00:00 crc kubenswrapper[4585]: I1201 14:00:00.036709 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9wjs5_6e7ad3ad-7937-409b-b1c9-9c801f937400/kube-multus/1.log" Dec 01 14:00:00 crc kubenswrapper[4585]: I1201 14:00:00.037474 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9wjs5_6e7ad3ad-7937-409b-b1c9-9c801f937400/kube-multus/0.log" Dec 01 14:00:00 crc kubenswrapper[4585]: I1201 14:00:00.037575 4585 generic.go:334] "Generic (PLEG): container finished" podID="6e7ad3ad-7937-409b-b1c9-9c801f937400" containerID="babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953" exitCode=1 Dec 01 14:00:00 crc kubenswrapper[4585]: I1201 14:00:00.037644 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9wjs5" event={"ID":"6e7ad3ad-7937-409b-b1c9-9c801f937400","Type":"ContainerDied","Data":"babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953"} Dec 01 14:00:00 crc kubenswrapper[4585]: I1201 14:00:00.037744 4585 scope.go:117] "RemoveContainer" containerID="9a3662f79465b11c168f80f17c534c792fb1cc6b8a5c0115d219d39def6db073" Dec 01 14:00:00 crc kubenswrapper[4585]: I1201 14:00:00.038317 4585 scope.go:117] "RemoveContainer" containerID="babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953" Dec 01 14:00:00 crc kubenswrapper[4585]: E1201 14:00:00.038544 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-9wjs5_openshift-multus(6e7ad3ad-7937-409b-b1c9-9c801f937400)\"" pod="openshift-multus/multus-9wjs5" podUID="6e7ad3ad-7937-409b-b1c9-9c801f937400" Dec 01 14:00:00 crc kubenswrapper[4585]: I1201 14:00:00.412324 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:00 crc kubenswrapper[4585]: I1201 14:00:00.412375 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:00 crc kubenswrapper[4585]: I1201 14:00:00.412395 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:00 crc kubenswrapper[4585]: E1201 14:00:00.412500 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 14:00:00 crc kubenswrapper[4585]: E1201 14:00:00.412557 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 14:00:00 crc kubenswrapper[4585]: E1201 14:00:00.412630 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 14:00:01 crc kubenswrapper[4585]: I1201 14:00:01.043195 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9wjs5_6e7ad3ad-7937-409b-b1c9-9c801f937400/kube-multus/1.log" Dec 01 14:00:01 crc kubenswrapper[4585]: I1201 14:00:01.412594 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:01 crc kubenswrapper[4585]: E1201 14:00:01.412844 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 14:00:02 crc kubenswrapper[4585]: I1201 14:00:02.413154 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:02 crc kubenswrapper[4585]: I1201 14:00:02.413246 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:02 crc kubenswrapper[4585]: I1201 14:00:02.413177 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:02 crc kubenswrapper[4585]: E1201 14:00:02.413438 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 14:00:02 crc kubenswrapper[4585]: E1201 14:00:02.413545 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 14:00:02 crc kubenswrapper[4585]: E1201 14:00:02.413840 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 14:00:03 crc kubenswrapper[4585]: I1201 14:00:03.412695 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:03 crc kubenswrapper[4585]: E1201 14:00:03.412869 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 14:00:04 crc kubenswrapper[4585]: I1201 14:00:04.412408 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:04 crc kubenswrapper[4585]: I1201 14:00:04.412773 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:04 crc kubenswrapper[4585]: I1201 14:00:04.412673 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:04 crc kubenswrapper[4585]: E1201 14:00:04.412812 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 14:00:04 crc kubenswrapper[4585]: E1201 14:00:04.412913 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 14:00:04 crc kubenswrapper[4585]: I1201 14:00:04.413008 4585 scope.go:117] "RemoveContainer" containerID="6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd" Dec 01 14:00:04 crc kubenswrapper[4585]: E1201 14:00:04.413468 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 14:00:05 crc kubenswrapper[4585]: I1201 14:00:05.060202 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/3.log" Dec 01 14:00:05 crc kubenswrapper[4585]: I1201 14:00:05.063460 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerStarted","Data":"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90"} Dec 01 14:00:05 crc kubenswrapper[4585]: I1201 14:00:05.063989 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 14:00:05 crc kubenswrapper[4585]: I1201 14:00:05.373901 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podStartSLOduration=99.373878519 podStartE2EDuration="1m39.373878519s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:05.09299869 +0000 UTC m=+119.077212545" watchObservedRunningTime="2025-12-01 14:00:05.373878519 +0000 UTC m=+119.358092374" Dec 01 14:00:05 crc kubenswrapper[4585]: I1201 14:00:05.374619 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qrdw5"] Dec 01 14:00:05 crc kubenswrapper[4585]: I1201 14:00:05.374715 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:05 crc kubenswrapper[4585]: E1201 14:00:05.374806 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 14:00:06 crc kubenswrapper[4585]: E1201 14:00:06.369694 4585 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 01 14:00:06 crc kubenswrapper[4585]: I1201 14:00:06.412071 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:06 crc kubenswrapper[4585]: I1201 14:00:06.412174 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:06 crc kubenswrapper[4585]: I1201 14:00:06.412174 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:06 crc kubenswrapper[4585]: E1201 14:00:06.413671 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 14:00:06 crc kubenswrapper[4585]: I1201 14:00:06.413876 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:06 crc kubenswrapper[4585]: E1201 14:00:06.413950 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 14:00:06 crc kubenswrapper[4585]: E1201 14:00:06.414120 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 14:00:06 crc kubenswrapper[4585]: E1201 14:00:06.414309 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 14:00:06 crc kubenswrapper[4585]: E1201 14:00:06.545873 4585 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 14:00:08 crc kubenswrapper[4585]: I1201 14:00:08.412057 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:08 crc kubenswrapper[4585]: I1201 14:00:08.412070 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:08 crc kubenswrapper[4585]: E1201 14:00:08.413191 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 14:00:08 crc kubenswrapper[4585]: I1201 14:00:08.412121 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:08 crc kubenswrapper[4585]: E1201 14:00:08.413304 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 14:00:08 crc kubenswrapper[4585]: E1201 14:00:08.413346 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 14:00:08 crc kubenswrapper[4585]: I1201 14:00:08.412070 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:08 crc kubenswrapper[4585]: E1201 14:00:08.413461 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 14:00:10 crc kubenswrapper[4585]: I1201 14:00:10.411477 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:10 crc kubenswrapper[4585]: I1201 14:00:10.411563 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:10 crc kubenswrapper[4585]: I1201 14:00:10.411501 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:10 crc kubenswrapper[4585]: E1201 14:00:10.411700 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 14:00:10 crc kubenswrapper[4585]: I1201 14:00:10.412058 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:10 crc kubenswrapper[4585]: E1201 14:00:10.412177 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 14:00:10 crc kubenswrapper[4585]: E1201 14:00:10.412228 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 14:00:10 crc kubenswrapper[4585]: E1201 14:00:10.412399 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 14:00:11 crc kubenswrapper[4585]: E1201 14:00:11.548259 4585 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 14:00:12 crc kubenswrapper[4585]: I1201 14:00:12.412286 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:12 crc kubenswrapper[4585]: I1201 14:00:12.412365 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:12 crc kubenswrapper[4585]: E1201 14:00:12.412443 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 14:00:12 crc kubenswrapper[4585]: E1201 14:00:12.412548 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 14:00:12 crc kubenswrapper[4585]: I1201 14:00:12.412632 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:12 crc kubenswrapper[4585]: E1201 14:00:12.412729 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 14:00:12 crc kubenswrapper[4585]: I1201 14:00:12.412896 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:12 crc kubenswrapper[4585]: E1201 14:00:12.413026 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 14:00:13 crc kubenswrapper[4585]: I1201 14:00:13.414031 4585 scope.go:117] "RemoveContainer" containerID="babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953" Dec 01 14:00:14 crc kubenswrapper[4585]: I1201 14:00:14.094191 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9wjs5_6e7ad3ad-7937-409b-b1c9-9c801f937400/kube-multus/1.log" Dec 01 14:00:14 crc kubenswrapper[4585]: I1201 14:00:14.094497 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9wjs5" event={"ID":"6e7ad3ad-7937-409b-b1c9-9c801f937400","Type":"ContainerStarted","Data":"faa193efeada9b36721dd685be49ca406d22ffe8d1ba80f075c5d12bec3e3baf"} Dec 01 14:00:14 crc kubenswrapper[4585]: I1201 14:00:14.412258 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:14 crc kubenswrapper[4585]: I1201 14:00:14.412258 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:14 crc kubenswrapper[4585]: I1201 14:00:14.412400 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:14 crc kubenswrapper[4585]: I1201 14:00:14.413033 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:14 crc kubenswrapper[4585]: E1201 14:00:14.413252 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 14:00:14 crc kubenswrapper[4585]: E1201 14:00:14.413510 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 14:00:14 crc kubenswrapper[4585]: E1201 14:00:14.413621 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 14:00:14 crc kubenswrapper[4585]: E1201 14:00:14.413714 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 14:00:16 crc kubenswrapper[4585]: I1201 14:00:16.411800 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:16 crc kubenswrapper[4585]: I1201 14:00:16.411810 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:16 crc kubenswrapper[4585]: I1201 14:00:16.412787 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:16 crc kubenswrapper[4585]: I1201 14:00:16.413828 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:16 crc kubenswrapper[4585]: E1201 14:00:16.413936 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 14:00:16 crc kubenswrapper[4585]: E1201 14:00:16.413951 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 14:00:16 crc kubenswrapper[4585]: E1201 14:00:16.414217 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qrdw5" podUID="f11a95e1-135a-4fd2-9a04-1487c56a18e1" Dec 01 14:00:16 crc kubenswrapper[4585]: E1201 14:00:16.414325 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 14:00:18 crc kubenswrapper[4585]: I1201 14:00:18.412691 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:18 crc kubenswrapper[4585]: I1201 14:00:18.412763 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:18 crc kubenswrapper[4585]: I1201 14:00:18.412712 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:18 crc kubenswrapper[4585]: I1201 14:00:18.412719 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:18 crc kubenswrapper[4585]: I1201 14:00:18.415852 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 14:00:18 crc kubenswrapper[4585]: I1201 14:00:18.416544 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 14:00:18 crc kubenswrapper[4585]: I1201 14:00:18.416672 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 14:00:18 crc kubenswrapper[4585]: I1201 14:00:18.416800 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 14:00:18 crc kubenswrapper[4585]: I1201 14:00:18.417883 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 14:00:18 crc kubenswrapper[4585]: I1201 14:00:18.418264 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.820195 4585 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.862635 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-42pj4"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.863130 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.863909 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n4pr6"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.864159 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.865747 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m8p5b"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.866804 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.867486 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.867815 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.868795 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-f9k95"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.869311 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.869751 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.870255 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.880614 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.880640 4585 reflector.go:561] object-"openshift-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.880685 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.883316 4585 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.883354 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.883697 4585 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.883828 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.883766 4585 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in 
API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.884043 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.884068 4585 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.884228 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.884247 4585 reflector.go:561] object-"openshift-machine-api"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.884416 4585 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-images": failed to list *v1.ConfigMap: configmaps "machine-api-operator-images" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.884444 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-images\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-api-operator-images\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.883940 4585 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.884567 4585 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found 
between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.884602 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.884423 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.884569 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.884382 4585 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.884634 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.884751 4585 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.884770 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.884943 4585 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is 
forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.884982 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.886346 4585 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.886369 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.886448 4585 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.886461 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.888281 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.893260 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 14:00:22 crc kubenswrapper[4585]: W1201 14:00:22.911767 4585 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Dec 01 14:00:22 crc kubenswrapper[4585]: E1201 14:00:22.911815 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between 
node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.912215 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.912224 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.912591 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.912605 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.912643 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.940123 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.940210 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.940356 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.941334 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hx54r"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.941819 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.941905 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c7gls"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.942343 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.943585 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dsfs8"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.943854 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dsfs8" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.945433 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.946051 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.946186 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.946521 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.949465 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.950255 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.950270 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-p8vxn"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.951023 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.955715 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.956297 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.960115 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-sm85h"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.961094 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.961274 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.962078 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975013 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975381 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-etcd-client\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975435 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-serving-cert\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975462 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm7wr\" (UniqueName: \"kubernetes.io/projected/bb6e47d0-5966-48d3-be81-97265e7e7a4f-kube-api-access-lm7wr\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975488 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls596\" (UniqueName: \"kubernetes.io/projected/e0b7b830-078c-4448-b914-ab62e5ff7059-kube-api-access-ls596\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975512 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhj9p\" (UniqueName: \"kubernetes.io/projected/795dab1c-49d5-4b05-a84f-4e1655d459fc-kube-api-access-nhj9p\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975537 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fe562f92-5985-4fbf-a5b9-8359e7a044d9-node-pullsecrets\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975561 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cd3272a-98a9-421b-97f9-424bc5907cd1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r6pct\" (UID: \"5cd3272a-98a9-421b-97f9-424bc5907cd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975581 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-client-ca\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975603 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c449e593-5832-4dc3-b251-fc8d2838e680-auth-proxy-config\") pod \"machine-approver-56656f9798-6wzhb\" (UID: \"c449e593-5832-4dc3-b251-fc8d2838e680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975627 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-oauth-serving-cert\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975650 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/795dab1c-49d5-4b05-a84f-4e1655d459fc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975670 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975744 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cd3272a-98a9-421b-97f9-424bc5907cd1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r6pct\" (UID: \"5cd3272a-98a9-421b-97f9-424bc5907cd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975776 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-image-import-ca\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975796 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975825 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-config\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:22 crc 
kubenswrapper[4585]: I1201 14:00:22.975844 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c449e593-5832-4dc3-b251-fc8d2838e680-machine-approver-tls\") pod \"machine-approver-56656f9798-6wzhb\" (UID: \"c449e593-5832-4dc3-b251-fc8d2838e680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975860 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-trusted-ca-bundle\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975875 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-encryption-config\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975889 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c449e593-5832-4dc3-b251-fc8d2838e680-config\") pod \"machine-approver-56656f9798-6wzhb\" (UID: \"c449e593-5832-4dc3-b251-fc8d2838e680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975911 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975927 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-etcd-serving-ca\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.975948 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b7b830-078c-4448-b914-ab62e5ff7059-serving-cert\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.976011 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-config\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.984433 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/795dab1c-49d5-4b05-a84f-4e1655d459fc-images\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.984529 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.984445 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nrbjm"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.984535 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-serving-cert\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.984993 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/795dab1c-49d5-4b05-a84f-4e1655d459fc-config\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.985055 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-service-ca\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.985085 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-config\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.985301 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe562f92-5985-4fbf-a5b9-8359e7a044d9-audit-dir\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.985442 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwwvd\" (UniqueName: \"kubernetes.io/projected/c449e593-5832-4dc3-b251-fc8d2838e680-kube-api-access-rwwvd\") pod \"machine-approver-56656f9798-6wzhb\" (UID: \"c449e593-5832-4dc3-b251-fc8d2838e680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.985492 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-audit\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.985513 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qpjv\" 
(UniqueName: \"kubernetes.io/projected/5cd3272a-98a9-421b-97f9-424bc5907cd1-kube-api-access-5qpjv\") pod \"openshift-apiserver-operator-796bbdcf4f-r6pct\" (UID: \"5cd3272a-98a9-421b-97f9-424bc5907cd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.985639 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-oauth-config\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.985725 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.985772 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l749s\" (UniqueName: \"kubernetes.io/projected/fe562f92-5985-4fbf-a5b9-8359e7a044d9-kube-api-access-l749s\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.986873 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.987871 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.991804 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rqphx"] Dec 01 14:00:22 crc kubenswrapper[4585]: I1201 14:00:22.997703 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.013395 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.014113 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sljgx"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.014306 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.014404 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tgbx7"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.014609 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.014766 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.014997 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rqphx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.015109 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.015425 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.016055 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.022341 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.022402 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.031485 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.031687 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.068826 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.086947 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b7b830-078c-4448-b914-ab62e5ff7059-serving-cert\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087012 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087040 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-encryption-config\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087065 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pbfk\" (UniqueName: \"kubernetes.io/projected/1d15ad5d-2ee0-4543-8e56-89fa7a2461f7-kube-api-access-5pbfk\") pod \"migrator-59844c95c7-rqphx\" (UID: \"1d15ad5d-2ee0-4543-8e56-89fa7a2461f7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rqphx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087090 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-config\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087122 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-547g7\" (UniqueName: \"kubernetes.io/projected/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-kube-api-access-547g7\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087145 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/406a9c63-4d0d-4c27-8ade-804cd92b0985-images\") pod \"machine-config-operator-74547568cd-xjls2\" (UID: \"406a9c63-4d0d-4c27-8ade-804cd92b0985\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087169 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-config\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087220 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/795dab1c-49d5-4b05-a84f-4e1655d459fc-images\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087262 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/23eabf52-0331-423d-a779-c83ef2d2c0fc-signing-key\") pod \"service-ca-9c57cc56f-sljgx\" (UID: \"23eabf52-0331-423d-a779-c83ef2d2c0fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087291 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlj4s\" (UniqueName: \"kubernetes.io/projected/a988a2aa-8447-47fe-9b03-771d1633d69f-kube-api-access-nlj4s\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z48d\" (UID: \"a988a2aa-8447-47fe-9b03-771d1633d69f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087339 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26hrb\" (UniqueName: \"kubernetes.io/projected/e3752441-ce0c-46e5-bf1c-3bfab4ae6819-kube-api-access-26hrb\") pod \"machine-config-controller-84d6567774-pttbb\" (UID: \"e3752441-ce0c-46e5-bf1c-3bfab4ae6819\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087367 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72q79\" (UniqueName: 
\"kubernetes.io/projected/79abd33c-0184-473e-8bb9-c408a5c32efc-kube-api-access-72q79\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087405 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/766043b8-3f15-4428-855d-1e82aca4fb63-apiservice-cert\") pod \"packageserver-d55dfcdfc-cqw6s\" (UID: \"766043b8-3f15-4428-855d-1e82aca4fb63\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087429 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fb639a8-cf5f-4b09-aae1-35844b2e792d-serving-cert\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087452 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79abd33c-0184-473e-8bb9-c408a5c32efc-audit-dir\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087484 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087508 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/406a9c63-4d0d-4c27-8ade-804cd92b0985-proxy-tls\") pod \"machine-config-operator-74547568cd-xjls2\" (UID: \"406a9c63-4d0d-4c27-8ade-804cd92b0985\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087533 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087562 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-serving-cert\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087585 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/766043b8-3f15-4428-855d-1e82aca4fb63-tmpfs\") pod \"packageserver-d55dfcdfc-cqw6s\" (UID: 
\"766043b8-3f15-4428-855d-1e82aca4fb63\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087612 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7aedeec4-fc41-47aa-85a1-d5a92de50deb-config\") pod \"route-controller-manager-6576b87f9c-2vtl4\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087635 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087675 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/795dab1c-49d5-4b05-a84f-4e1655d459fc-config\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087701 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-serving-cert\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087724 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/766043b8-3f15-4428-855d-1e82aca4fb63-webhook-cert\") pod \"packageserver-d55dfcdfc-cqw6s\" (UID: \"766043b8-3f15-4428-855d-1e82aca4fb63\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087750 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bf71ed2f-6b27-4ce7-93ae-6f5ced50b306-available-featuregates\") pod \"openshift-config-operator-7777fb866f-sm85h\" (UID: \"bf71ed2f-6b27-4ce7-93ae-6f5ced50b306\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087774 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acf6d794-22cb-4a06-adf1-4bea5f76e854-serving-cert\") pod \"console-operator-58897d9998-hx54r\" (UID: \"acf6d794-22cb-4a06-adf1-4bea5f76e854\") " pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.087900 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcb5aba5-cd6c-4535-a92d-d1583fe02b18-config\") pod \"kube-controller-manager-operator-78b949d7b-bcsxh\" (UID: \"dcb5aba5-cd6c-4535-a92d-d1583fe02b18\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.088065 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a988a2aa-8447-47fe-9b03-771d1633d69f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z48d\" (UID: \"a988a2aa-8447-47fe-9b03-771d1633d69f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.088160 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bcv4\" (UniqueName: \"kubernetes.io/projected/406a9c63-4d0d-4c27-8ade-804cd92b0985-kube-api-access-2bcv4\") pod \"machine-config-operator-74547568cd-xjls2\" (UID: \"406a9c63-4d0d-4c27-8ade-804cd92b0985\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.088239 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-config\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.088327 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6scb\" (UniqueName: \"kubernetes.io/projected/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-kube-api-access-b6scb\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.088403 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe562f92-5985-4fbf-a5b9-8359e7a044d9-audit-dir\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.088479 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwvd\" (UniqueName: \"kubernetes.io/projected/c449e593-5832-4dc3-b251-fc8d2838e680-kube-api-access-rwwvd\") pod \"machine-approver-56656f9798-6wzhb\" (UID: \"c449e593-5832-4dc3-b251-fc8d2838e680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.088548 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzd22\" (UniqueName: \"kubernetes.io/projected/7aedeec4-fc41-47aa-85a1-d5a92de50deb-kube-api-access-bzd22\") pod \"route-controller-manager-6576b87f9c-2vtl4\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.088632 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-service-ca\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc 
kubenswrapper[4585]: I1201 14:00:23.088711 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-config\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.088801 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.088884 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.088989 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089078 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd92n\" (UniqueName: \"kubernetes.io/projected/81c799b6-ed27-4b1c-9751-55fdcc101362-kube-api-access-wd92n\") pod \"cluster-image-registry-operator-dc59b4c8b-h9trv\" (UID: \"81c799b6-ed27-4b1c-9751-55fdcc101362\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089166 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcb5aba5-cd6c-4535-a92d-d1583fe02b18-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bcsxh\" (UID: \"dcb5aba5-cd6c-4535-a92d-d1583fe02b18\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089256 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-audit-policies\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089339 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qpjv\" (UniqueName: \"kubernetes.io/projected/5cd3272a-98a9-421b-97f9-424bc5907cd1-kube-api-access-5qpjv\") pod \"openshift-apiserver-operator-796bbdcf4f-r6pct\" (UID: \"5cd3272a-98a9-421b-97f9-424bc5907cd1\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089407 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-audit\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089477 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zns2k\" (UniqueName: \"kubernetes.io/projected/23eabf52-0331-423d-a779-c83ef2d2c0fc-kube-api-access-zns2k\") pod \"service-ca-9c57cc56f-sljgx\" (UID: \"23eabf52-0331-423d-a779-c83ef2d2c0fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089548 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd26v\" (UniqueName: \"kubernetes.io/projected/6fb639a8-cf5f-4b09-aae1-35844b2e792d-kube-api-access-bd26v\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089628 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089701 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-oauth-config\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089783 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089855 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l749s\" (UniqueName: \"kubernetes.io/projected/fe562f92-5985-4fbf-a5b9-8359e7a044d9-kube-api-access-l749s\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089925 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-audit-policies\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090011 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7aedeec4-fc41-47aa-85a1-d5a92de50deb-client-ca\") pod \"route-controller-manager-6576b87f9c-2vtl4\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090088 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-etcd-client\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090172 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-etcd-client\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090244 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-audit-dir\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089485 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-service-ca\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090332 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-serving-cert\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089954 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-audit\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090405 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbb7l\" (UniqueName: \"kubernetes.io/projected/766043b8-3f15-4428-855d-1e82aca4fb63-kube-api-access-wbb7l\") pod \"packageserver-d55dfcdfc-cqw6s\" (UID: \"766043b8-3f15-4428-855d-1e82aca4fb63\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090461 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-serving-cert\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " 
pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090507 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm7wr\" (UniqueName: \"kubernetes.io/projected/bb6e47d0-5966-48d3-be81-97265e7e7a4f-kube-api-access-lm7wr\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090540 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090584 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090610 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81c799b6-ed27-4b1c-9751-55fdcc101362-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h9trv\" (UID: \"81c799b6-ed27-4b1c-9751-55fdcc101362\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.089485 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-config\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.088743 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe562f92-5985-4fbf-a5b9-8359e7a044d9-audit-dir\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090672 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls596\" (UniqueName: \"kubernetes.io/projected/e0b7b830-078c-4448-b914-ab62e5ff7059-kube-api-access-ls596\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090704 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhj9p\" (UniqueName: \"kubernetes.io/projected/795dab1c-49d5-4b05-a84f-4e1655d459fc-kube-api-access-nhj9p\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090730 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fe562f92-5985-4fbf-a5b9-8359e7a044d9-node-pullsecrets\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090808 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090841 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb5aba5-cd6c-4535-a92d-d1583fe02b18-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bcsxh\" (UID: \"dcb5aba5-cd6c-4535-a92d-d1583fe02b18\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090867 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/23eabf52-0331-423d-a779-c83ef2d2c0fc-signing-cabundle\") pod \"service-ca-9c57cc56f-sljgx\" (UID: \"23eabf52-0331-423d-a779-c83ef2d2c0fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090905 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090930 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6fb639a8-cf5f-4b09-aae1-35844b2e792d-etcd-service-ca\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.090951 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf71ed2f-6b27-4ce7-93ae-6f5ced50b306-serving-cert\") pod \"openshift-config-operator-7777fb866f-sm85h\" (UID: \"bf71ed2f-6b27-4ce7-93ae-6f5ced50b306\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091036 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cd3272a-98a9-421b-97f9-424bc5907cd1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r6pct\" (UID: \"5cd3272a-98a9-421b-97f9-424bc5907cd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091061 4585 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-client-ca\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091088 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3752441-ce0c-46e5-bf1c-3bfab4ae6819-proxy-tls\") pod \"machine-config-controller-84d6567774-pttbb\" (UID: \"e3752441-ce0c-46e5-bf1c-3bfab4ae6819\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091112 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-service-ca-bundle\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091196 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb7qg\" (UniqueName: \"kubernetes.io/projected/acf6d794-22cb-4a06-adf1-4bea5f76e854-kube-api-access-wb7qg\") pod \"console-operator-58897d9998-hx54r\" (UID: \"acf6d794-22cb-4a06-adf1-4bea5f76e854\") " pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091233 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c449e593-5832-4dc3-b251-fc8d2838e680-auth-proxy-config\") pod \"machine-approver-56656f9798-6wzhb\" (UID: \"c449e593-5832-4dc3-b251-fc8d2838e680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091258 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-oauth-serving-cert\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091281 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/795dab1c-49d5-4b05-a84f-4e1655d459fc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091306 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/81c799b6-ed27-4b1c-9751-55fdcc101362-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h9trv\" (UID: \"81c799b6-ed27-4b1c-9751-55fdcc101362\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091333 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/5cd3272a-98a9-421b-97f9-424bc5907cd1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r6pct\" (UID: \"5cd3272a-98a9-421b-97f9-424bc5907cd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091358 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-image-import-ca\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091520 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fe562f92-5985-4fbf-a5b9-8359e7a044d9-node-pullsecrets\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091589 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091634 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3752441-ce0c-46e5-bf1c-3bfab4ae6819-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pttbb\" (UID: \"e3752441-ce0c-46e5-bf1c-3bfab4ae6819\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091664 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf56f15d-84d6-47c7-b8f7-1c9922d8f53f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-flcxm\" (UID: \"bf56f15d-84d6-47c7-b8f7-1c9922d8f53f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091689 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6fb639a8-cf5f-4b09-aae1-35844b2e792d-etcd-client\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.091717 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/406a9c63-4d0d-4c27-8ade-804cd92b0985-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xjls2\" (UID: \"406a9c63-4d0d-4c27-8ade-804cd92b0985\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.092463 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a988a2aa-8447-47fe-9b03-771d1633d69f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z48d\" (UID: \"a988a2aa-8447-47fe-9b03-771d1633d69f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.092493 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb639a8-cf5f-4b09-aae1-35844b2e792d-config\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.092520 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krsjb\" (UniqueName: \"kubernetes.io/projected/bf71ed2f-6b27-4ce7-93ae-6f5ced50b306-kube-api-access-krsjb\") pod \"openshift-config-operator-7777fb866f-sm85h\" (UID: \"bf71ed2f-6b27-4ce7-93ae-6f5ced50b306\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.092538 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acf6d794-22cb-4a06-adf1-4bea5f76e854-trusted-ca\") pod \"console-operator-58897d9998-hx54r\" (UID: \"acf6d794-22cb-4a06-adf1-4bea5f76e854\") " pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.092566 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-config\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.092581 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c449e593-5832-4dc3-b251-fc8d2838e680-machine-approver-tls\") pod \"machine-approver-56656f9798-6wzhb\" (UID: \"c449e593-5832-4dc3-b251-fc8d2838e680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.092597 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7aedeec4-fc41-47aa-85a1-d5a92de50deb-serving-cert\") pod \"route-controller-manager-6576b87f9c-2vtl4\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.092613 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.093043 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5cd3272a-98a9-421b-97f9-424bc5907cd1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r6pct\" (UID: \"5cd3272a-98a9-421b-97f9-424bc5907cd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.093449 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c449e593-5832-4dc3-b251-fc8d2838e680-auth-proxy-config\") pod \"machine-approver-56656f9798-6wzhb\" (UID: \"c449e593-5832-4dc3-b251-fc8d2838e680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.093510 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6fb639a8-cf5f-4b09-aae1-35844b2e792d-etcd-ca\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.093555 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4txwv\" (UniqueName: \"kubernetes.io/projected/c239d6eb-535e-442b-a67a-f8227313ceb4-kube-api-access-4txwv\") pod \"downloads-7954f5f757-dsfs8\" (UID: \"c239d6eb-535e-442b-a67a-f8227313ceb4\") " pod="openshift-console/downloads-7954f5f757-dsfs8" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.093586 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acf6d794-22cb-4a06-adf1-4bea5f76e854-config\") pod \"console-operator-58897d9998-hx54r\" (UID: \"acf6d794-22cb-4a06-adf1-4bea5f76e854\") " pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.093630 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-encryption-config\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.093657 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c449e593-5832-4dc3-b251-fc8d2838e680-config\") pod \"machine-approver-56656f9798-6wzhb\" (UID: \"c449e593-5832-4dc3-b251-fc8d2838e680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.093822 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-trusted-ca-bundle\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.093851 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-etcd-serving-ca\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc 
kubenswrapper[4585]: I1201 14:00:23.093883 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgm82\" (UniqueName: \"kubernetes.io/projected/bf56f15d-84d6-47c7-b8f7-1c9922d8f53f-kube-api-access-kgm82\") pod \"cluster-samples-operator-665b6dd947-flcxm\" (UID: \"bf56f15d-84d6-47c7-b8f7-1c9922d8f53f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.093911 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.093936 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81c799b6-ed27-4b1c-9751-55fdcc101362-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h9trv\" (UID: \"81c799b6-ed27-4b1c-9751-55fdcc101362\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.093965 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-image-import-ca\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.094315 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-config\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.094390 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-etcd-serving-ca\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.098208 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-oauth-config\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.100584 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-oauth-serving-cert\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.101116 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-etcd-client\") 
pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.113589 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.114011 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.114124 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.114226 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.114394 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.114499 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.114616 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.114789 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.115118 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.115225 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.113711 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2l65t"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.115581 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.113812 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.115231 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.116477 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2l65t" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.116653 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c449e593-5832-4dc3-b251-fc8d2838e680-config\") pod \"machine-approver-56656f9798-6wzhb\" (UID: \"c449e593-5832-4dc3-b251-fc8d2838e680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.117506 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c449e593-5832-4dc3-b251-fc8d2838e680-machine-approver-tls\") pod \"machine-approver-56656f9798-6wzhb\" (UID: \"c449e593-5832-4dc3-b251-fc8d2838e680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.118962 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.119110 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.120051 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.121310 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.122043 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.122234 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.122519 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.122542 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.122686 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.122747 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.122796 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.122884 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.122931 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.123090 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.123204 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.123215 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.123320 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.123734 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.123828 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.123994 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.124132 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.124247 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.124346 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.124496 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.128736 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129144 4585 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129183 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129348 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129469 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129540 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129592 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129365 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129145 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129654 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129413 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129472 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129772 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129803 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129857 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129902 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129934 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129955 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.129808 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130062 4585 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130106 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130148 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130167 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130200 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130020 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130263 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130330 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130398 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130485 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130501 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130689 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cd3272a-98a9-421b-97f9-424bc5907cd1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r6pct\" (UID: \"5cd3272a-98a9-421b-97f9-424bc5907cd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130730 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.130543 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.136111 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.136291 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.137941 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.139670 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7hljs"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.141382 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7hljs" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.142879 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.143246 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.171504 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.178111 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-trusted-ca-bundle\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.196519 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.197817 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.198136 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199539 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6scb\" (UniqueName: \"kubernetes.io/projected/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-kube-api-access-b6scb\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199575 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a988a2aa-8447-47fe-9b03-771d1633d69f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z48d\" (UID: \"a988a2aa-8447-47fe-9b03-771d1633d69f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199594 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bcv4\" (UniqueName: \"kubernetes.io/projected/406a9c63-4d0d-4c27-8ade-804cd92b0985-kube-api-access-2bcv4\") pod \"machine-config-operator-74547568cd-xjls2\" (UID: \"406a9c63-4d0d-4c27-8ade-804cd92b0985\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199613 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bzd22\" (UniqueName: \"kubernetes.io/projected/7aedeec4-fc41-47aa-85a1-d5a92de50deb-kube-api-access-bzd22\") pod \"route-controller-manager-6576b87f9c-2vtl4\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199655 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199673 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199696 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcb5aba5-cd6c-4535-a92d-d1583fe02b18-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bcsxh\" (UID: \"dcb5aba5-cd6c-4535-a92d-d1583fe02b18\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199715 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-audit-policies\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199735 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199755 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd92n\" (UniqueName: \"kubernetes.io/projected/81c799b6-ed27-4b1c-9751-55fdcc101362-kube-api-access-wd92n\") pod \"cluster-image-registry-operator-dc59b4c8b-h9trv\" (UID: \"81c799b6-ed27-4b1c-9751-55fdcc101362\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199774 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zns2k\" (UniqueName: \"kubernetes.io/projected/23eabf52-0331-423d-a779-c83ef2d2c0fc-kube-api-access-zns2k\") pod \"service-ca-9c57cc56f-sljgx\" (UID: \"23eabf52-0331-423d-a779-c83ef2d2c0fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199792 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199812 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd26v\" (UniqueName: \"kubernetes.io/projected/6fb639a8-cf5f-4b09-aae1-35844b2e792d-kube-api-access-bd26v\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199849 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-audit-policies\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199870 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7aedeec4-fc41-47aa-85a1-d5a92de50deb-client-ca\") pod \"route-controller-manager-6576b87f9c-2vtl4\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199885 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-etcd-client\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199901 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbb7l\" (UniqueName: \"kubernetes.io/projected/766043b8-3f15-4428-855d-1e82aca4fb63-kube-api-access-wbb7l\") pod \"packageserver-d55dfcdfc-cqw6s\" (UID: \"766043b8-3f15-4428-855d-1e82aca4fb63\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199925 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-audit-dir\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199939 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-serving-cert\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199954 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81c799b6-ed27-4b1c-9751-55fdcc101362-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h9trv\" (UID: \"81c799b6-ed27-4b1c-9751-55fdcc101362\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.199996 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200015 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200031 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb5aba5-cd6c-4535-a92d-d1583fe02b18-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bcsxh\" (UID: \"dcb5aba5-cd6c-4535-a92d-d1583fe02b18\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200047 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200063 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6fb639a8-cf5f-4b09-aae1-35844b2e792d-etcd-service-ca\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200078 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf71ed2f-6b27-4ce7-93ae-6f5ced50b306-serving-cert\") pod \"openshift-config-operator-7777fb866f-sm85h\" (UID: \"bf71ed2f-6b27-4ce7-93ae-6f5ced50b306\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200094 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/23eabf52-0331-423d-a779-c83ef2d2c0fc-signing-cabundle\") pod \"service-ca-9c57cc56f-sljgx\" (UID: \"23eabf52-0331-423d-a779-c83ef2d2c0fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200126 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 
14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200143 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3752441-ce0c-46e5-bf1c-3bfab4ae6819-proxy-tls\") pod \"machine-config-controller-84d6567774-pttbb\" (UID: \"e3752441-ce0c-46e5-bf1c-3bfab4ae6819\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200158 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-service-ca-bundle\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200173 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb7qg\" (UniqueName: \"kubernetes.io/projected/acf6d794-22cb-4a06-adf1-4bea5f76e854-kube-api-access-wb7qg\") pod \"console-operator-58897d9998-hx54r\" (UID: \"acf6d794-22cb-4a06-adf1-4bea5f76e854\") " pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200187 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200201 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/81c799b6-ed27-4b1c-9751-55fdcc101362-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h9trv\" (UID: \"81c799b6-ed27-4b1c-9751-55fdcc101362\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200218 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3752441-ce0c-46e5-bf1c-3bfab4ae6819-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pttbb\" (UID: \"e3752441-ce0c-46e5-bf1c-3bfab4ae6819\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200233 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf56f15d-84d6-47c7-b8f7-1c9922d8f53f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-flcxm\" (UID: \"bf56f15d-84d6-47c7-b8f7-1c9922d8f53f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200250 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6fb639a8-cf5f-4b09-aae1-35844b2e792d-etcd-client\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200281 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/406a9c63-4d0d-4c27-8ade-804cd92b0985-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-xjls2\" (UID: \"406a9c63-4d0d-4c27-8ade-804cd92b0985\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200298 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a988a2aa-8447-47fe-9b03-771d1633d69f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z48d\" (UID: \"a988a2aa-8447-47fe-9b03-771d1633d69f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200319 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb639a8-cf5f-4b09-aae1-35844b2e792d-config\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200370 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krsjb\" (UniqueName: \"kubernetes.io/projected/bf71ed2f-6b27-4ce7-93ae-6f5ced50b306-kube-api-access-krsjb\") pod \"openshift-config-operator-7777fb866f-sm85h\" (UID: \"bf71ed2f-6b27-4ce7-93ae-6f5ced50b306\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200392 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acf6d794-22cb-4a06-adf1-4bea5f76e854-trusted-ca\") pod \"console-operator-58897d9998-hx54r\" (UID: \"acf6d794-22cb-4a06-adf1-4bea5f76e854\") " pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200407 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7aedeec4-fc41-47aa-85a1-d5a92de50deb-serving-cert\") pod \"route-controller-manager-6576b87f9c-2vtl4\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200427 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200443 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6fb639a8-cf5f-4b09-aae1-35844b2e792d-etcd-ca\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200458 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4txwv\" (UniqueName: \"kubernetes.io/projected/c239d6eb-535e-442b-a67a-f8227313ceb4-kube-api-access-4txwv\") pod \"downloads-7954f5f757-dsfs8\" (UID: \"c239d6eb-535e-442b-a67a-f8227313ceb4\") " pod="openshift-console/downloads-7954f5f757-dsfs8" 
Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200463 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200475 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acf6d794-22cb-4a06-adf1-4bea5f76e854-config\") pod \"console-operator-58897d9998-hx54r\" (UID: \"acf6d794-22cb-4a06-adf1-4bea5f76e854\") " pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200497 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200514 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81c799b6-ed27-4b1c-9751-55fdcc101362-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h9trv\" (UID: \"81c799b6-ed27-4b1c-9751-55fdcc101362\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200531 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgm82\" (UniqueName: \"kubernetes.io/projected/bf56f15d-84d6-47c7-b8f7-1c9922d8f53f-kube-api-access-kgm82\") pod \"cluster-samples-operator-665b6dd947-flcxm\" (UID: \"bf56f15d-84d6-47c7-b8f7-1c9922d8f53f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200551 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-encryption-config\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200574 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pbfk\" (UniqueName: \"kubernetes.io/projected/1d15ad5d-2ee0-4543-8e56-89fa7a2461f7-kube-api-access-5pbfk\") pod \"migrator-59844c95c7-rqphx\" (UID: \"1d15ad5d-2ee0-4543-8e56-89fa7a2461f7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rqphx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200590 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-config\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200613 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200629 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-547g7\" (UniqueName: \"kubernetes.io/projected/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-kube-api-access-547g7\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200648 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/406a9c63-4d0d-4c27-8ade-804cd92b0985-images\") pod \"machine-config-operator-74547568cd-xjls2\" (UID: \"406a9c63-4d0d-4c27-8ade-804cd92b0985\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200663 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/23eabf52-0331-423d-a779-c83ef2d2c0fc-signing-key\") pod \"service-ca-9c57cc56f-sljgx\" (UID: \"23eabf52-0331-423d-a779-c83ef2d2c0fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200679 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlj4s\" (UniqueName: \"kubernetes.io/projected/a988a2aa-8447-47fe-9b03-771d1633d69f-kube-api-access-nlj4s\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z48d\" (UID: \"a988a2aa-8447-47fe-9b03-771d1633d69f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200694 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26hrb\" (UniqueName: \"kubernetes.io/projected/e3752441-ce0c-46e5-bf1c-3bfab4ae6819-kube-api-access-26hrb\") pod \"machine-config-controller-84d6567774-pttbb\" (UID: \"e3752441-ce0c-46e5-bf1c-3bfab4ae6819\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200716 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72q79\" (UniqueName: \"kubernetes.io/projected/79abd33c-0184-473e-8bb9-c408a5c32efc-kube-api-access-72q79\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200731 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/766043b8-3f15-4428-855d-1e82aca4fb63-apiservice-cert\") pod \"packageserver-d55dfcdfc-cqw6s\" (UID: \"766043b8-3f15-4428-855d-1e82aca4fb63\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200747 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fb639a8-cf5f-4b09-aae1-35844b2e792d-serving-cert\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 
14:00:23.200763 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79abd33c-0184-473e-8bb9-c408a5c32efc-audit-dir\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200779 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200797 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/406a9c63-4d0d-4c27-8ade-804cd92b0985-proxy-tls\") pod \"machine-config-operator-74547568cd-xjls2\" (UID: \"406a9c63-4d0d-4c27-8ade-804cd92b0985\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200796 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a988a2aa-8447-47fe-9b03-771d1633d69f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z48d\" (UID: \"a988a2aa-8447-47fe-9b03-771d1633d69f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.200817 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.201184 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.201212 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.201737 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hdrjx"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.201767 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.202109 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.202332 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.203370 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.203759 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf71ed2f-6b27-4ce7-93ae-6f5ced50b306-serving-cert\") pod \"openshift-config-operator-7777fb866f-sm85h\" (UID: \"bf71ed2f-6b27-4ce7-93ae-6f5ced50b306\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.204418 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-serving-cert\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.204783 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.205038 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/766043b8-3f15-4428-855d-1e82aca4fb63-tmpfs\") pod \"packageserver-d55dfcdfc-cqw6s\" (UID: \"766043b8-3f15-4428-855d-1e82aca4fb63\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.205103 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7aedeec4-fc41-47aa-85a1-d5a92de50deb-config\") pod \"route-controller-manager-6576b87f9c-2vtl4\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.205133 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.205161 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/766043b8-3f15-4428-855d-1e82aca4fb63-webhook-cert\") pod \"packageserver-d55dfcdfc-cqw6s\" (UID: \"766043b8-3f15-4428-855d-1e82aca4fb63\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.205182 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bf71ed2f-6b27-4ce7-93ae-6f5ced50b306-available-featuregates\") pod \"openshift-config-operator-7777fb866f-sm85h\" (UID: \"bf71ed2f-6b27-4ce7-93ae-6f5ced50b306\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.205206 4585 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acf6d794-22cb-4a06-adf1-4bea5f76e854-serving-cert\") pod \"console-operator-58897d9998-hx54r\" (UID: \"acf6d794-22cb-4a06-adf1-4bea5f76e854\") " pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.205237 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-serving-cert\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.205290 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcb5aba5-cd6c-4535-a92d-d1583fe02b18-config\") pod \"kube-controller-manager-operator-78b949d7b-bcsxh\" (UID: \"dcb5aba5-cd6c-4535-a92d-d1583fe02b18\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.205632 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.205886 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.206177 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcb5aba5-cd6c-4535-a92d-d1583fe02b18-config\") pod \"kube-controller-manager-operator-78b949d7b-bcsxh\" (UID: \"dcb5aba5-cd6c-4535-a92d-d1583fe02b18\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.207091 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.207362 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.212310 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acf6d794-22cb-4a06-adf1-4bea5f76e854-trusted-ca\") pod \"console-operator-58897d9998-hx54r\" (UID: \"acf6d794-22cb-4a06-adf1-4bea5f76e854\") " pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.214343 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.214434 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-audit-policies\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.215228 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-config\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.216216 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.216336 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79abd33c-0184-473e-8bb9-c408a5c32efc-audit-dir\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.216654 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.217674 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7aedeec4-fc41-47aa-85a1-d5a92de50deb-client-ca\") pod \"route-controller-manager-6576b87f9c-2vtl4\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.217993 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.218196 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/406a9c63-4d0d-4c27-8ade-804cd92b0985-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xjls2\" (UID: \"406a9c63-4d0d-4c27-8ade-804cd92b0985\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.218576 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.218994 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81c799b6-ed27-4b1c-9751-55fdcc101362-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h9trv\" (UID: \"81c799b6-ed27-4b1c-9751-55fdcc101362\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.219013 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.219044 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-audit-dir\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.220965 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.222426 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf56f15d-84d6-47c7-b8f7-1c9922d8f53f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-flcxm\" (UID: \"bf56f15d-84d6-47c7-b8f7-1c9922d8f53f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.223159 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a988a2aa-8447-47fe-9b03-771d1633d69f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z48d\" (UID: \"a988a2aa-8447-47fe-9b03-771d1633d69f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.223445 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-encryption-config\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.224388 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.225398 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-42pj4"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.225763 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-serving-cert\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.226241 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.227785 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.228178 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n4pr6"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.228208 4585 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.228340 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-service-ca-bundle\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.228369 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.228868 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/766043b8-3f15-4428-855d-1e82aca4fb63-tmpfs\") pod \"packageserver-d55dfcdfc-cqw6s\" (UID: \"766043b8-3f15-4428-855d-1e82aca4fb63\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.229045 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3752441-ce0c-46e5-bf1c-3bfab4ae6819-proxy-tls\") pod \"machine-config-controller-84d6567774-pttbb\" (UID: \"e3752441-ce0c-46e5-bf1c-3bfab4ae6819\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.230097 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3752441-ce0c-46e5-bf1c-3bfab4ae6819-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pttbb\" (UID: \"e3752441-ce0c-46e5-bf1c-3bfab4ae6819\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.231037 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.231740 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/81c799b6-ed27-4b1c-9751-55fdcc101362-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h9trv\" (UID: \"81c799b6-ed27-4b1c-9751-55fdcc101362\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.232177 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-etcd-client\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.232201 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-f9k95"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.232640 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7aedeec4-fc41-47aa-85a1-d5a92de50deb-serving-cert\") pod \"route-controller-manager-6576b87f9c-2vtl4\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.233031 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.233410 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-audit-policies\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.233616 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bf71ed2f-6b27-4ce7-93ae-6f5ced50b306-available-featuregates\") pod \"openshift-config-operator-7777fb866f-sm85h\" (UID: \"bf71ed2f-6b27-4ce7-93ae-6f5ced50b306\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.233634 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.233796 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.233848 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7aedeec4-fc41-47aa-85a1-d5a92de50deb-config\") pod \"route-controller-manager-6576b87f9c-2vtl4\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.234730 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acf6d794-22cb-4a06-adf1-4bea5f76e854-config\") pod \"console-operator-58897d9998-hx54r\" (UID: \"acf6d794-22cb-4a06-adf1-4bea5f76e854\") " pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.235453 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe562f92-5985-4fbf-a5b9-8359e7a044d9-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.235792 4585 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.236582 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.237291 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-fxg9h"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.237710 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.238694 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.239448 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.240725 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.240769 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.240743 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.241805 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.243014 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m8p5b"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.243042 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.243101 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.243247 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.244004 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/766043b8-3f15-4428-855d-1e82aca4fb63-apiservice-cert\") pod \"packageserver-d55dfcdfc-cqw6s\" (UID: \"766043b8-3f15-4428-855d-1e82aca4fb63\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.244444 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acf6d794-22cb-4a06-adf1-4bea5f76e854-serving-cert\") pod \"console-operator-58897d9998-hx54r\" (UID: \"acf6d794-22cb-4a06-adf1-4bea5f76e854\") " pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.244848 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb5aba5-cd6c-4535-a92d-d1583fe02b18-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bcsxh\" (UID: \"dcb5aba5-cd6c-4535-a92d-d1583fe02b18\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.247652 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c7gls"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.248336 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.248439 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-serving-cert\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.248456 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/766043b8-3f15-4428-855d-1e82aca4fb63-webhook-cert\") pod \"packageserver-d55dfcdfc-cqw6s\" (UID: \"766043b8-3f15-4428-855d-1e82aca4fb63\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.248667 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.249709 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.251367 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.252607 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.254292 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-p8vxn"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.254504 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.255740 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.259575 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.263046 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hdrjx"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.265404 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dsfs8"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.266078 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.268822 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.270249 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.274869 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nrbjm"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.278448 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-42z4z"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.283570 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-42z4z" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.284385 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-d2nj6"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.285811 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.286789 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.287782 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c5f9s"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.289717 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.291473 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-2xmlv"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.292821 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6fb639a8-cf5f-4b09-aae1-35844b2e792d-etcd-client\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.293825 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2xmlv" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.295281 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tgbx7"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.298536 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.298586 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.301161 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hx54r"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.303124 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7hljs"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.305710 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.306556 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rqphx"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.308987 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2l65t"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.310486 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-sm85h"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.312033 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.313259 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sljgx"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.314854 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.316404 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.317858 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-42z4z"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 
14:00:23.319364 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.320586 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.322016 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d2nj6"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.325611 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.326020 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.328496 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.329120 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c5f9s"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.330224 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h"] Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.345496 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.352935 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6fb639a8-cf5f-4b09-aae1-35844b2e792d-etcd-service-ca\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.365604 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.386234 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.405981 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.412770 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fb639a8-cf5f-4b09-aae1-35844b2e792d-config\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.426075 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.433147 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6fb639a8-cf5f-4b09-aae1-35844b2e792d-etcd-ca\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.446083 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.465643 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.468352 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/23eabf52-0331-423d-a779-c83ef2d2c0fc-signing-cabundle\") pod \"service-ca-9c57cc56f-sljgx\" (UID: \"23eabf52-0331-423d-a779-c83ef2d2c0fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.486120 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.506312 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.512602 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/23eabf52-0331-423d-a779-c83ef2d2c0fc-signing-key\") pod \"service-ca-9c57cc56f-sljgx\" (UID: \"23eabf52-0331-423d-a779-c83ef2d2c0fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.526038 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.545876 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.565267 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.569789 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fb639a8-cf5f-4b09-aae1-35844b2e792d-serving-cert\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.585853 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.605815 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.607994 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/406a9c63-4d0d-4c27-8ade-804cd92b0985-images\") pod \"machine-config-operator-74547568cd-xjls2\" (UID: \"406a9c63-4d0d-4c27-8ade-804cd92b0985\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.626628 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.631022 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/406a9c63-4d0d-4c27-8ade-804cd92b0985-proxy-tls\") pod \"machine-config-operator-74547568cd-xjls2\" (UID: \"406a9c63-4d0d-4c27-8ade-804cd92b0985\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.679988 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwwvd\" (UniqueName: \"kubernetes.io/projected/c449e593-5832-4dc3-b251-fc8d2838e680-kube-api-access-rwwvd\") pod \"machine-approver-56656f9798-6wzhb\" (UID: \"c449e593-5832-4dc3-b251-fc8d2838e680\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.700828 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qpjv\" (UniqueName: \"kubernetes.io/projected/5cd3272a-98a9-421b-97f9-424bc5907cd1-kube-api-access-5qpjv\") pod \"openshift-apiserver-operator-796bbdcf4f-r6pct\" (UID: \"5cd3272a-98a9-421b-97f9-424bc5907cd1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.722312 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l749s\" (UniqueName: \"kubernetes.io/projected/fe562f92-5985-4fbf-a5b9-8359e7a044d9-kube-api-access-l749s\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.740580 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm7wr\" (UniqueName: \"kubernetes.io/projected/bb6e47d0-5966-48d3-be81-97265e7e7a4f-kube-api-access-lm7wr\") pod \"console-f9d7485db-f9k95\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.785193 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.806195 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.814481 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.824830 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.826064 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.833675 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.846171 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: W1201 14:00:23.847863 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc449e593_5832_4dc3_b251_fc8d2838e680.slice/crio-fd926f96a86dee12c7535a7b823cb00e237699eb304845fd96aaf00e2141b389 WatchSource:0}: Error finding container fd926f96a86dee12c7535a7b823cb00e237699eb304845fd96aaf00e2141b389: Status 404 returned error can't find the container with id fd926f96a86dee12c7535a7b823cb00e237699eb304845fd96aaf00e2141b389 Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.886906 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.907959 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.964671 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.966903 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.967595 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 14:00:23 crc kubenswrapper[4585]: I1201 14:00:23.985784 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.005574 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.025668 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.046235 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.065558 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct"] Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.067185 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 14:00:24 crc kubenswrapper[4585]: W1201 14:00:24.074814 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cd3272a_98a9_421b_97f9_424bc5907cd1.slice/crio-35b0e319a0b6cb55b1fc61f4d3efb8f1b486a5731d97cd0ac27d22df9fe49695 WatchSource:0}: Error finding container 35b0e319a0b6cb55b1fc61f4d3efb8f1b486a5731d97cd0ac27d22df9fe49695: Status 404 returned error can't find the container with id 35b0e319a0b6cb55b1fc61f4d3efb8f1b486a5731d97cd0ac27d22df9fe49695 Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.085882 4585 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.087192 4585 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.087276 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0b7b830-078c-4448-b914-ab62e5ff7059-serving-cert podName:e0b7b830-078c-4448-b914-ab62e5ff7059 nodeName:}" failed. No retries permitted until 2025-12-01 14:00:24.587253658 +0000 UTC m=+138.571467513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e0b7b830-078c-4448-b914-ab62e5ff7059-serving-cert") pod "controller-manager-879f6c89f-n4pr6" (UID: "e0b7b830-078c-4448-b914-ab62e5ff7059") : failed to sync secret cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.088851 4585 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.088904 4585 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.088911 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/795dab1c-49d5-4b05-a84f-4e1655d459fc-config podName:795dab1c-49d5-4b05-a84f-4e1655d459fc nodeName:}" failed. No retries permitted until 2025-12-01 14:00:24.588895451 +0000 UTC m=+138.573109306 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/795dab1c-49d5-4b05-a84f-4e1655d459fc-config") pod "machine-api-operator-5694c8668f-42pj4" (UID: "795dab1c-49d5-4b05-a84f-4e1655d459fc") : failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.088953 4585 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.088986 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-serving-cert podName:fe562f92-5985-4fbf-a5b9-8359e7a044d9 nodeName:}" failed. No retries permitted until 2025-12-01 14:00:24.588959383 +0000 UTC m=+138.573173238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-serving-cert") pod "apiserver-76f77b778f-m8p5b" (UID: "fe562f92-5985-4fbf-a5b9-8359e7a044d9") : failed to sync secret cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.089029 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/795dab1c-49d5-4b05-a84f-4e1655d459fc-images podName:795dab1c-49d5-4b05-a84f-4e1655d459fc nodeName:}" failed. No retries permitted until 2025-12-01 14:00:24.589014705 +0000 UTC m=+138.573228630 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/795dab1c-49d5-4b05-a84f-4e1655d459fc-images") pod "machine-api-operator-5694c8668f-42pj4" (UID: "795dab1c-49d5-4b05-a84f-4e1655d459fc") : failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.092536 4585 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.092599 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-client-ca podName:e0b7b830-078c-4448-b914-ab62e5ff7059 nodeName:}" failed. No retries permitted until 2025-12-01 14:00:24.59258526 +0000 UTC m=+138.576799115 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-client-ca") pod "controller-manager-879f6c89f-n4pr6" (UID: "e0b7b830-078c-4448-b914-ab62e5ff7059") : failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.092632 4585 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.092656 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-proxy-ca-bundles podName:e0b7b830-078c-4448-b914-ab62e5ff7059 nodeName:}" failed. No retries permitted until 2025-12-01 14:00:24.592647352 +0000 UTC m=+138.576861207 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-proxy-ca-bundles") pod "controller-manager-879f6c89f-n4pr6" (UID: "e0b7b830-078c-4448-b914-ab62e5ff7059") : failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.092676 4585 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.092696 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/795dab1c-49d5-4b05-a84f-4e1655d459fc-machine-api-operator-tls podName:795dab1c-49d5-4b05-a84f-4e1655d459fc nodeName:}" failed. No retries permitted until 2025-12-01 14:00:24.592690853 +0000 UTC m=+138.576904708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/795dab1c-49d5-4b05-a84f-4e1655d459fc-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-42pj4" (UID: "795dab1c-49d5-4b05-a84f-4e1655d459fc") : failed to sync secret cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.093796 4585 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.093859 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-encryption-config podName:fe562f92-5985-4fbf-a5b9-8359e7a044d9 nodeName:}" failed. 
No retries permitted until 2025-12-01 14:00:24.59384648 +0000 UTC m=+138.578060335 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-encryption-config") pod "apiserver-76f77b778f-m8p5b" (UID: "fe562f92-5985-4fbf-a5b9-8359e7a044d9") : failed to sync secret cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.109273 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-f9k95"] Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.111907 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 14:00:24 crc kubenswrapper[4585]: W1201 14:00:24.123522 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb6e47d0_5966_48d3_be81_97265e7e7a4f.slice/crio-3d988e6436ba0817468a47dcfc9bd8caf37924a1c3905dad8be2ca04e6e53c99 WatchSource:0}: Error finding container 3d988e6436ba0817468a47dcfc9bd8caf37924a1c3905dad8be2ca04e6e53c99: Status 404 returned error can't find the container with id 3d988e6436ba0817468a47dcfc9bd8caf37924a1c3905dad8be2ca04e6e53c99 Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.125444 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.135726 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" event={"ID":"c449e593-5832-4dc3-b251-fc8d2838e680","Type":"ContainerStarted","Data":"fd926f96a86dee12c7535a7b823cb00e237699eb304845fd96aaf00e2141b389"} Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.136372 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-f9k95" event={"ID":"bb6e47d0-5966-48d3-be81-97265e7e7a4f","Type":"ContainerStarted","Data":"3d988e6436ba0817468a47dcfc9bd8caf37924a1c3905dad8be2ca04e6e53c99"} Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.137017 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" event={"ID":"5cd3272a-98a9-421b-97f9-424bc5907cd1","Type":"ContainerStarted","Data":"35b0e319a0b6cb55b1fc61f4d3efb8f1b486a5731d97cd0ac27d22df9fe49695"} Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.145727 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.181372 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zns2k\" (UniqueName: \"kubernetes.io/projected/23eabf52-0331-423d-a779-c83ef2d2c0fc-kube-api-access-zns2k\") pod \"service-ca-9c57cc56f-sljgx\" (UID: \"23eabf52-0331-423d-a779-c83ef2d2c0fc\") " pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.185909 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.200322 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6scb\" (UniqueName: \"kubernetes.io/projected/7918cfa2-6bbe-4434-a62f-8b06e3ff324e-kube-api-access-b6scb\") pod \"apiserver-7bbb656c7d-2zzvd\" (UID: \"7918cfa2-6bbe-4434-a62f-8b06e3ff324e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.203543 4585 request.go:700] Waited for 1.002792122s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.219928 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krsjb\" (UniqueName: \"kubernetes.io/projected/bf71ed2f-6b27-4ce7-93ae-6f5ced50b306-kube-api-access-krsjb\") pod \"openshift-config-operator-7777fb866f-sm85h\" (UID: \"bf71ed2f-6b27-4ce7-93ae-6f5ced50b306\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.225408 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.247310 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.247750 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.267241 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.285915 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.292143 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.311190 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.326385 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.347310 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.382053 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bcv4\" (UniqueName: \"kubernetes.io/projected/406a9c63-4d0d-4c27-8ade-804cd92b0985-kube-api-access-2bcv4\") pod \"machine-config-operator-74547568cd-xjls2\" (UID: \"406a9c63-4d0d-4c27-8ade-804cd92b0985\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.403457 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzd22\" (UniqueName: \"kubernetes.io/projected/7aedeec4-fc41-47aa-85a1-d5a92de50deb-kube-api-access-bzd22\") pod \"route-controller-manager-6576b87f9c-2vtl4\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.421915 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81c799b6-ed27-4b1c-9751-55fdcc101362-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h9trv\" (UID: \"81c799b6-ed27-4b1c-9751-55fdcc101362\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.445959 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgm82\" (UniqueName: \"kubernetes.io/projected/bf56f15d-84d6-47c7-b8f7-1c9922d8f53f-kube-api-access-kgm82\") pod \"cluster-samples-operator-665b6dd947-flcxm\" (UID: \"bf56f15d-84d6-47c7-b8f7-1c9922d8f53f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.460446 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sljgx"] Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.469237 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbb7l\" (UniqueName: \"kubernetes.io/projected/766043b8-3f15-4428-855d-1e82aca4fb63-kube-api-access-wbb7l\") pod \"packageserver-d55dfcdfc-cqw6s\" (UID: \"766043b8-3f15-4428-855d-1e82aca4fb63\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.481432 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd"] Dec 01 14:00:24 crc kubenswrapper[4585]: W1201 14:00:24.482560 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23eabf52_0331_423d_a779_c83ef2d2c0fc.slice/crio-9f867d01ae6a1c4d0fe4eaf889816eca439eff7dc728a18fb62db2d00ef02158 WatchSource:0}: Error finding container 
9f867d01ae6a1c4d0fe4eaf889816eca439eff7dc728a18fb62db2d00ef02158: Status 404 returned error can't find the container with id 9f867d01ae6a1c4d0fe4eaf889816eca439eff7dc728a18fb62db2d00ef02158 Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.483444 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26hrb\" (UniqueName: \"kubernetes.io/projected/e3752441-ce0c-46e5-bf1c-3bfab4ae6819-kube-api-access-26hrb\") pod \"machine-config-controller-84d6567774-pttbb\" (UID: \"e3752441-ce0c-46e5-bf1c-3bfab4ae6819\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.495595 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.502204 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlj4s\" (UniqueName: \"kubernetes.io/projected/a988a2aa-8447-47fe-9b03-771d1633d69f-kube-api-access-nlj4s\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z48d\" (UID: \"a988a2aa-8447-47fe-9b03-771d1633d69f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.520876 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd26v\" (UniqueName: \"kubernetes.io/projected/6fb639a8-cf5f-4b09-aae1-35844b2e792d-kube-api-access-bd26v\") pod \"etcd-operator-b45778765-tgbx7\" (UID: \"6fb639a8-cf5f-4b09-aae1-35844b2e792d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.533318 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.548381 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.548676 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pbfk\" (UniqueName: \"kubernetes.io/projected/1d15ad5d-2ee0-4543-8e56-89fa7a2461f7-kube-api-access-5pbfk\") pod \"migrator-59844c95c7-rqphx\" (UID: \"1d15ad5d-2ee0-4543-8e56-89fa7a2461f7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rqphx" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.559505 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.568304 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-sm85h"] Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.582749 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72q79\" (UniqueName: \"kubernetes.io/projected/79abd33c-0184-473e-8bb9-c408a5c32efc-kube-api-access-72q79\") pod \"oauth-openshift-558db77b4-c7gls\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.587396 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.605255 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.629478 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.643477 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.651556 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.652703 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-serving-cert\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.652761 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/795dab1c-49d5-4b05-a84f-4e1655d459fc-config\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.652871 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-client-ca\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.652899 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/795dab1c-49d5-4b05-a84f-4e1655d459fc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.652926 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.652963 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-encryption-config\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.653001 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b7b830-078c-4448-b914-ab62e5ff7059-serving-cert\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.653034 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/795dab1c-49d5-4b05-a84f-4e1655d459fc-images\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.675542 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.704887 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.704957 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-547g7\" (UniqueName: \"kubernetes.io/projected/59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2-kube-api-access-547g7\") pod \"authentication-operator-69f744f599-p8vxn\" (UID: \"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.719668 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb7qg\" (UniqueName: \"kubernetes.io/projected/acf6d794-22cb-4a06-adf1-4bea5f76e854-kube-api-access-wb7qg\") pod \"console-operator-58897d9998-hx54r\" (UID: \"acf6d794-22cb-4a06-adf1-4bea5f76e854\") " pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.741272 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcb5aba5-cd6c-4535-a92d-d1583fe02b18-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bcsxh\" (UID: \"dcb5aba5-cd6c-4535-a92d-d1583fe02b18\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.752756 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.760292 4585 projected.go:288] Couldn't get configMap openshift-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.761654 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.767095 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rqphx" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.767189 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.774255 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd92n\" (UniqueName: \"kubernetes.io/projected/81c799b6-ed27-4b1c-9751-55fdcc101362-kube-api-access-wd92n\") pod \"cluster-image-registry-operator-dc59b4c8b-h9trv\" (UID: \"81c799b6-ed27-4b1c-9751-55fdcc101362\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.776113 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:24 crc kubenswrapper[4585]: E1201 14:00:24.777434 4585 projected.go:288] Couldn't get configMap openshift-machine-api/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.790057 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4txwv\" (UniqueName: \"kubernetes.io/projected/c239d6eb-535e-442b-a67a-f8227313ceb4-kube-api-access-4txwv\") pod \"downloads-7954f5f757-dsfs8\" (UID: \"c239d6eb-535e-442b-a67a-f8227313ceb4\") " pod="openshift-console/downloads-7954f5f757-dsfs8" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.791878 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.811919 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4"] Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.813697 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.820106 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-dsfs8" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.828799 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.852657 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.854153 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2"] Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.854364 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d"] Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.865698 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.874219 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.885334 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.887108 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.906524 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.928510 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.946090 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.966063 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 14:00:24 crc kubenswrapper[4585]: W1201 14:00:24.982186 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406a9c63_4d0d_4c27_8ade_804cd92b0985.slice/crio-16b1a73b5ed7ceddb9e5ea2075b7585f683bc7b6d26a132c36798b888eb8f159 WatchSource:0}: Error finding container 16b1a73b5ed7ceddb9e5ea2075b7585f683bc7b6d26a132c36798b888eb8f159: Status 404 returned error can't find the container with id 16b1a73b5ed7ceddb9e5ea2075b7585f683bc7b6d26a132c36798b888eb8f159 Dec 01 14:00:24 crc kubenswrapper[4585]: I1201 14:00:24.985303 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.015357 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.025044 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.025925 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.044916 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.046706 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb"] Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.078207 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.088872 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.106810 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.109927 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s"] Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.129112 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.135266 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hx54r"] Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.147951 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.168948 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.195513 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.202722 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" event={"ID":"406a9c63-4d0d-4c27-8ade-804cd92b0985","Type":"ContainerStarted","Data":"16b1a73b5ed7ceddb9e5ea2075b7585f683bc7b6d26a132c36798b888eb8f159"} Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.204244 4585 request.go:700] Waited for 1.920329738s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.206303 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.209838 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" 
event={"ID":"e3752441-ce0c-46e5-bf1c-3bfab4ae6819","Type":"ContainerStarted","Data":"5c8a62b48db6412cf383505d13494a6fc0580437e5fc3dd75a982e42fa57332e"} Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.214248 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" event={"ID":"7aedeec4-fc41-47aa-85a1-d5a92de50deb","Type":"ContainerStarted","Data":"399fa49b23bfa440b0d8a1ea9cb5a6d18bf49facf00a52b0589f86ceebf5ca49"} Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.271466 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.271672 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.271846 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.286647 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" event={"ID":"c449e593-5832-4dc3-b251-fc8d2838e680","Type":"ContainerStarted","Data":"d66037de694dda2133e8fe032a42fa937169c6fb8307f11f9b2a694d1e73919a"} Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.286695 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" event={"ID":"c449e593-5832-4dc3-b251-fc8d2838e680","Type":"ContainerStarted","Data":"5afa419980525c7432b0c40bff93e8b40241233e85e5f54be540af1ea34b731b"} Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.288724 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" event={"ID":"bf71ed2f-6b27-4ce7-93ae-6f5ced50b306","Type":"ContainerStarted","Data":"fa79b76cf7eecd891d0a4aab1222ccaa72fd959efd3b1a70443d6acd53a2c31b"} Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.290224 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" event={"ID":"7918cfa2-6bbe-4434-a62f-8b06e3ff324e","Type":"ContainerStarted","Data":"4c1f4dc9f28b99956b96dfff2217c40c2b6b9beb0931299c40074982151254a7"} Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.291316 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" event={"ID":"5cd3272a-98a9-421b-97f9-424bc5907cd1","Type":"ContainerStarted","Data":"296313fd1bb7e0546c94df4d1b58c09030df01d9402c8ade9142b4c22780bb45"} Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.292686 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" event={"ID":"a988a2aa-8447-47fe-9b03-771d1633d69f","Type":"ContainerStarted","Data":"8ff858b4b53b93563413256b2465c396c58155009929398fc438b48d2021e9e3"} Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.295488 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.307219 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-f9k95" 
event={"ID":"bb6e47d0-5966-48d3-be81-97265e7e7a4f","Type":"ContainerStarted","Data":"44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a"} Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.310141 4585 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.326367 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm"] Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.336107 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.350072 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.351887 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" event={"ID":"23eabf52-0331-423d-a779-c83ef2d2c0fc","Type":"ContainerStarted","Data":"11a6ddcda5097e5852dde099dafa083ce6cb122a9b81201e01d5bdd02f590978"} Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.351922 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" event={"ID":"23eabf52-0331-423d-a779-c83ef2d2c0fc","Type":"ContainerStarted","Data":"9f867d01ae6a1c4d0fe4eaf889816eca439eff7dc728a18fb62db2d00ef02158"} Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.371714 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.389401 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.430523 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.455535 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.457750 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/795dab1c-49d5-4b05-a84f-4e1655d459fc-config\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.466435 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.471344 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6763aabd-f571-4b13-82fd-3a4a9bdf8406-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.471408 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/6763aabd-f571-4b13-82fd-3a4a9bdf8406-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.471451 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.471702 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-bound-sa-token\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.471771 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6763aabd-f571-4b13-82fd-3a4a9bdf8406-trusted-ca\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.471805 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwps2\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-kube-api-access-dwps2\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.471822 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-registry-tls\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.471842 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6763aabd-f571-4b13-82fd-3a4a9bdf8406-registry-certificates\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.472164 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:25.972153288 +0000 UTC m=+139.956367143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.491786 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.507606 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.513613 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-encryption-config\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.514218 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-client-ca\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.532763 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tgbx7"] Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.552071 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.559270 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe562f92-5985-4fbf-a5b9-8359e7a044d9-serving-cert\") pod \"apiserver-76f77b778f-m8p5b\" (UID: \"fe562f92-5985-4fbf-a5b9-8359e7a044d9\") " pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.581443 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.581575 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6763aabd-f571-4b13-82fd-3a4a9bdf8406-registry-certificates\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.581601 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f9e4580-6380-44b8-aec6-8b4ff897cd82-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pm2hc\" (UID: \"7f9e4580-6380-44b8-aec6-8b4ff897cd82\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.581628 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fb007ad6-c27c-4d9e-8b4f-942db474d37b-srv-cert\") pod \"catalog-operator-68c6474976-jcxv5\" (UID: \"fb007ad6-c27c-4d9e-8b4f-942db474d37b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.581704 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94mbc\" (UniqueName: \"kubernetes.io/projected/0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd-kube-api-access-94mbc\") pod \"dns-operator-744455d44c-2l65t\" (UID: \"0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd\") " pod="openshift-dns-operator/dns-operator-744455d44c-2l65t" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.581749 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6djt\" (UniqueName: \"kubernetes.io/projected/e7b5b76d-8f89-4753-81a8-81a886d87abf-kube-api-access-h6djt\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvp7h\" (UID: \"e7b5b76d-8f89-4753-81a8-81a886d87abf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.581780 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7-metrics-tls\") pod \"dns-default-d2nj6\" (UID: \"5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7\") " pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.581830 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6763aabd-f571-4b13-82fd-3a4a9bdf8406-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.581852 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6gsl\" (UniqueName: \"kubernetes.io/projected/dd4ddb1d-714d-42d7-b307-cd7db58d02de-kube-api-access-r6gsl\") pod \"olm-operator-6b444d44fb-pfvzd\" (UID: \"dd4ddb1d-714d-42d7-b307-cd7db58d02de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.581875 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f9e4580-6380-44b8-aec6-8b4ff897cd82-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pm2hc\" (UID: \"7f9e4580-6380-44b8-aec6-8b4ff897cd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.581907 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddqrc\" (UniqueName: \"kubernetes.io/projected/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-kube-api-access-ddqrc\") pod \"csi-hostpathplugin-c5f9s\" (UID: 
\"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.581992 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5a1b274-e180-463c-acf4-d2a5aa181827-metrics-tls\") pod \"ingress-operator-5b745b69d9-kwp96\" (UID: \"c5a1b274-e180-463c-acf4-d2a5aa181827\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582012 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsdnr\" (UniqueName: \"kubernetes.io/projected/c5a1b274-e180-463c-acf4-d2a5aa181827-kube-api-access-zsdnr\") pod \"ingress-operator-5b745b69d9-kwp96\" (UID: \"c5a1b274-e180-463c-acf4-d2a5aa181827\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582028 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7-config-volume\") pod \"dns-default-d2nj6\" (UID: \"5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7\") " pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582045 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrrk\" (UniqueName: \"kubernetes.io/projected/1ca012d6-094a-4703-b8cc-d9d53fa9886d-kube-api-access-qsrrk\") pod \"collect-profiles-29409960-6f6w5\" (UID: \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582065 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-mountpoint-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582086 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0de865f-14b4-4fd8-9be7-3b5655814a67-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jsmc4\" (UID: \"f0de865f-14b4-4fd8-9be7-3b5655814a67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582106 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-service-ca-bundle\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582125 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-plugins-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 
14:00:25.582148 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f9e4580-6380-44b8-aec6-8b4ff897cd82-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pm2hc\" (UID: \"7f9e4580-6380-44b8-aec6-8b4ff897cd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582167 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0de865f-14b4-4fd8-9be7-3b5655814a67-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jsmc4\" (UID: \"f0de865f-14b4-4fd8-9be7-3b5655814a67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582220 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca012d6-094a-4703-b8cc-d9d53fa9886d-secret-volume\") pod \"collect-profiles-29409960-6f6w5\" (UID: \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582295 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/878aa194-4036-493a-8e75-990c4b57e793-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7hljs\" (UID: \"878aa194-4036-493a-8e75-990c4b57e793\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7hljs" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582330 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-metrics-certs\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582352 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b5b76d-8f89-4753-81a8-81a886d87abf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvp7h\" (UID: \"e7b5b76d-8f89-4753-81a8-81a886d87abf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582369 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-stats-auth\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582435 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5a1b274-e180-463c-acf4-d2a5aa181827-trusted-ca\") pod \"ingress-operator-5b745b69d9-kwp96\" (UID: \"c5a1b274-e180-463c-acf4-d2a5aa181827\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 
14:00:25.582456 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfl8z\" (UniqueName: \"kubernetes.io/projected/e118cb4b-2a7d-4f32-ad96-98ad49c928f7-kube-api-access-qfl8z\") pod \"ingress-canary-42z4z\" (UID: \"e118cb4b-2a7d-4f32-ad96-98ad49c928f7\") " pod="openshift-ingress-canary/ingress-canary-42z4z" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582475 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b615253a-f52e-4607-a63c-7cf1c07dab6b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hdrjx\" (UID: \"b615253a-f52e-4607-a63c-7cf1c07dab6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582516 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxcds\" (UniqueName: \"kubernetes.io/projected/fc4fd121-a822-4178-b67f-3f93eedd535e-kube-api-access-lxcds\") pod \"package-server-manager-789f6589d5-79qpf\" (UID: \"fc4fd121-a822-4178-b67f-3f93eedd535e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582590 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dd4ddb1d-714d-42d7-b307-cd7db58d02de-srv-cert\") pod \"olm-operator-6b444d44fb-pfvzd\" (UID: \"dd4ddb1d-714d-42d7-b307-cd7db58d02de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582612 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgf9b\" (UniqueName: \"kubernetes.io/projected/3b3388b2-3b24-448b-8806-9857c4e057b9-kube-api-access-cgf9b\") pod \"machine-config-server-2xmlv\" (UID: \"3b3388b2-3b24-448b-8806-9857c4e057b9\") " pod="openshift-machine-config-operator/machine-config-server-2xmlv" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582644 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6763aabd-f571-4b13-82fd-3a4a9bdf8406-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582691 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd-metrics-tls\") pod \"dns-operator-744455d44c-2l65t\" (UID: \"0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd\") " pod="openshift-dns-operator/dns-operator-744455d44c-2l65t" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582715 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7b5b76d-8f89-4753-81a8-81a886d87abf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvp7h\" (UID: \"e7b5b76d-8f89-4753-81a8-81a886d87abf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582734 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0de865f-14b4-4fd8-9be7-3b5655814a67-config\") pod \"kube-apiserver-operator-766d6c64bb-jsmc4\" (UID: \"f0de865f-14b4-4fd8-9be7-3b5655814a67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582763 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rrbm\" (UniqueName: \"kubernetes.io/projected/fb007ad6-c27c-4d9e-8b4f-942db474d37b-kube-api-access-2rrbm\") pod \"catalog-operator-68c6474976-jcxv5\" (UID: \"fb007ad6-c27c-4d9e-8b4f-942db474d37b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.582785 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.082767621 +0000 UTC m=+140.066981476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582841 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5a1b274-e180-463c-acf4-d2a5aa181827-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kwp96\" (UID: \"c5a1b274-e180-463c-acf4-d2a5aa181827\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582863 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-registration-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582908 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qth7v\" (UniqueName: \"kubernetes.io/projected/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-kube-api-access-qth7v\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.582952 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e118cb4b-2a7d-4f32-ad96-98ad49c928f7-cert\") pod \"ingress-canary-42z4z\" (UID: \"e118cb4b-2a7d-4f32-ad96-98ad49c928f7\") " pod="openshift-ingress-canary/ingress-canary-42z4z" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583018 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/dd4ddb1d-714d-42d7-b307-cd7db58d02de-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pfvzd\" (UID: \"dd4ddb1d-714d-42d7-b307-cd7db58d02de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583052 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57v2v\" (UniqueName: \"kubernetes.io/projected/b615253a-f52e-4607-a63c-7cf1c07dab6b-kube-api-access-57v2v\") pod \"marketplace-operator-79b997595-hdrjx\" (UID: \"b615253a-f52e-4607-a63c-7cf1c07dab6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583069 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca012d6-094a-4703-b8cc-d9d53fa9886d-config-volume\") pod \"collect-profiles-29409960-6f6w5\" (UID: \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583132 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583148 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-default-certificate\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583226 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-bound-sa-token\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583248 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-socket-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583276 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc4fd121-a822-4178-b67f-3f93eedd535e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-79qpf\" (UID: \"fc4fd121-a822-4178-b67f-3f93eedd535e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583307 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4c6d8fb5-74db-4420-90f6-33a5026a755f-config\") pod \"service-ca-operator-777779d784-9r6bj\" (UID: \"4c6d8fb5-74db-4420-90f6-33a5026a755f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583324 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-csi-data-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583362 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c6d8fb5-74db-4420-90f6-33a5026a755f-serving-cert\") pod \"service-ca-operator-777779d784-9r6bj\" (UID: \"4c6d8fb5-74db-4420-90f6-33a5026a755f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583378 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3b3388b2-3b24-448b-8806-9857c4e057b9-certs\") pod \"machine-config-server-2xmlv\" (UID: \"3b3388b2-3b24-448b-8806-9857c4e057b9\") " pod="openshift-machine-config-operator/machine-config-server-2xmlv" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583412 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b615253a-f52e-4607-a63c-7cf1c07dab6b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hdrjx\" (UID: \"b615253a-f52e-4607-a63c-7cf1c07dab6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583443 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlbsz\" (UniqueName: \"kubernetes.io/projected/5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7-kube-api-access-zlbsz\") pod \"dns-default-d2nj6\" (UID: \"5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7\") " pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583498 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6763aabd-f571-4b13-82fd-3a4a9bdf8406-trusted-ca\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583515 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc83aa4a-2686-47c8-876b-c6cf2192b493-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rn9hl\" (UID: \"dc83aa4a-2686-47c8-876b-c6cf2192b493\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583532 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/3b3388b2-3b24-448b-8806-9857c4e057b9-node-bootstrap-token\") pod \"machine-config-server-2xmlv\" (UID: \"3b3388b2-3b24-448b-8806-9857c4e057b9\") " pod="openshift-machine-config-operator/machine-config-server-2xmlv" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583551 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fb007ad6-c27c-4d9e-8b4f-942db474d37b-profile-collector-cert\") pod \"catalog-operator-68c6474976-jcxv5\" (UID: \"fb007ad6-c27c-4d9e-8b4f-942db474d37b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583580 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwps2\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-kube-api-access-dwps2\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583597 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-registry-tls\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583613 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx585\" (UniqueName: \"kubernetes.io/projected/878aa194-4036-493a-8e75-990c4b57e793-kube-api-access-vx585\") pod \"multus-admission-controller-857f4d67dd-7hljs\" (UID: \"878aa194-4036-493a-8e75-990c4b57e793\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7hljs" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583632 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgdgf\" (UniqueName: \"kubernetes.io/projected/4c6d8fb5-74db-4420-90f6-33a5026a755f-kube-api-access-pgdgf\") pod \"service-ca-operator-777779d784-9r6bj\" (UID: \"4c6d8fb5-74db-4420-90f6-33a5026a755f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.583650 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q52mz\" (UniqueName: \"kubernetes.io/projected/dc83aa4a-2686-47c8-876b-c6cf2192b493-kube-api-access-q52mz\") pod \"control-plane-machine-set-operator-78cbb6b69f-rn9hl\" (UID: \"dc83aa4a-2686-47c8-876b-c6cf2192b493\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.586765 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.596370 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6763aabd-f571-4b13-82fd-3a4a9bdf8406-registry-certificates\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc 
kubenswrapper[4585]: I1201 14:00:25.597650 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.598846 4585 projected.go:194] Error preparing data for projected volume kube-api-access-nhj9p for pod openshift-machine-api/machine-api-operator-5694c8668f-42pj4: failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.599634 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/795dab1c-49d5-4b05-a84f-4e1655d459fc-kube-api-access-nhj9p podName:795dab1c-49d5-4b05-a84f-4e1655d459fc nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.099609984 +0000 UTC m=+140.083823839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nhj9p" (UniqueName: "kubernetes.io/projected/795dab1c-49d5-4b05-a84f-4e1655d459fc-kube-api-access-nhj9p") pod "machine-api-operator-5694c8668f-42pj4" (UID: "795dab1c-49d5-4b05-a84f-4e1655d459fc") : failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.600832 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6763aabd-f571-4b13-82fd-3a4a9bdf8406-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.619907 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.119889097 +0000 UTC m=+140.104102952 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.634127 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6763aabd-f571-4b13-82fd-3a4a9bdf8406-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.639469 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.646875 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/795dab1c-49d5-4b05-a84f-4e1655d459fc-images\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.651338 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.655400 4585 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.655629 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6763aabd-f571-4b13-82fd-3a4a9bdf8406-trusted-ca\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.657363 4585 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.660121 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0b7b830-078c-4448-b914-ab62e5ff7059-serving-cert podName:e0b7b830-078c-4448-b914-ab62e5ff7059 nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.660101232 +0000 UTC m=+140.644315087 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e0b7b830-078c-4448-b914-ab62e5ff7059-serving-cert") pod "controller-manager-879f6c89f-n4pr6" (UID: "e0b7b830-078c-4448-b914-ab62e5ff7059") : failed to sync secret cache: timed out waiting for the condition Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.660253 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-proxy-ca-bundles podName:e0b7b830-078c-4448-b914-ab62e5ff7059 nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.660244887 +0000 UTC m=+140.644458742 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-proxy-ca-bundles") pod "controller-manager-879f6c89f-n4pr6" (UID: "e0b7b830-078c-4448-b914-ab62e5ff7059") : failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.661595 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-registry-tls\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.665184 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.670604 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/795dab1c-49d5-4b05-a84f-4e1655d459fc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.689853 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.692380 4585 projected.go:194] Error preparing data for projected volume kube-api-access-ls596 for pod openshift-controller-manager/controller-manager-879f6c89f-n4pr6: failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.692480 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e0b7b830-078c-4448-b914-ab62e5ff7059-kube-api-access-ls596 podName:e0b7b830-078c-4448-b914-ab62e5ff7059 nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.192457004 +0000 UTC m=+140.176670859 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ls596" (UniqueName: "kubernetes.io/projected/e0b7b830-078c-4448-b914-ab62e5ff7059-kube-api-access-ls596") pod "controller-manager-879f6c89f-n4pr6" (UID: "e0b7b830-078c-4448-b914-ab62e5ff7059") : failed to sync configmap cache: timed out waiting for the condition Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.693798 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694473 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd-metrics-tls\") pod \"dns-operator-744455d44c-2l65t\" (UID: \"0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd\") " pod="openshift-dns-operator/dns-operator-744455d44c-2l65t" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694504 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7b5b76d-8f89-4753-81a8-81a886d87abf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvp7h\" (UID: \"e7b5b76d-8f89-4753-81a8-81a886d87abf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694528 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0de865f-14b4-4fd8-9be7-3b5655814a67-config\") pod \"kube-apiserver-operator-766d6c64bb-jsmc4\" (UID: \"f0de865f-14b4-4fd8-9be7-3b5655814a67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694545 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rrbm\" (UniqueName: \"kubernetes.io/projected/fb007ad6-c27c-4d9e-8b4f-942db474d37b-kube-api-access-2rrbm\") pod \"catalog-operator-68c6474976-jcxv5\" (UID: \"fb007ad6-c27c-4d9e-8b4f-942db474d37b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694577 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5a1b274-e180-463c-acf4-d2a5aa181827-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kwp96\" (UID: \"c5a1b274-e180-463c-acf4-d2a5aa181827\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694594 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-registration-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694615 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qth7v\" (UniqueName: \"kubernetes.io/projected/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-kube-api-access-qth7v\") pod 
\"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694638 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e118cb4b-2a7d-4f32-ad96-98ad49c928f7-cert\") pod \"ingress-canary-42z4z\" (UID: \"e118cb4b-2a7d-4f32-ad96-98ad49c928f7\") " pod="openshift-ingress-canary/ingress-canary-42z4z" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694663 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dd4ddb1d-714d-42d7-b307-cd7db58d02de-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pfvzd\" (UID: \"dd4ddb1d-714d-42d7-b307-cd7db58d02de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694692 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57v2v\" (UniqueName: \"kubernetes.io/projected/b615253a-f52e-4607-a63c-7cf1c07dab6b-kube-api-access-57v2v\") pod \"marketplace-operator-79b997595-hdrjx\" (UID: \"b615253a-f52e-4607-a63c-7cf1c07dab6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694712 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca012d6-094a-4703-b8cc-d9d53fa9886d-config-volume\") pod \"collect-profiles-29409960-6f6w5\" (UID: \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694742 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-default-certificate\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694773 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-socket-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694798 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc4fd121-a822-4178-b67f-3f93eedd535e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-79qpf\" (UID: \"fc4fd121-a822-4178-b67f-3f93eedd535e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694828 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c6d8fb5-74db-4420-90f6-33a5026a755f-config\") pod \"service-ca-operator-777779d784-9r6bj\" (UID: \"4c6d8fb5-74db-4420-90f6-33a5026a755f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694853 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-csi-data-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694882 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c6d8fb5-74db-4420-90f6-33a5026a755f-serving-cert\") pod \"service-ca-operator-777779d784-9r6bj\" (UID: \"4c6d8fb5-74db-4420-90f6-33a5026a755f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694904 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3b3388b2-3b24-448b-8806-9857c4e057b9-certs\") pod \"machine-config-server-2xmlv\" (UID: \"3b3388b2-3b24-448b-8806-9857c4e057b9\") " pod="openshift-machine-config-operator/machine-config-server-2xmlv" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694933 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b615253a-f52e-4607-a63c-7cf1c07dab6b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hdrjx\" (UID: \"b615253a-f52e-4607-a63c-7cf1c07dab6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.694963 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlbsz\" (UniqueName: \"kubernetes.io/projected/5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7-kube-api-access-zlbsz\") pod \"dns-default-d2nj6\" (UID: \"5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7\") " pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695001 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc83aa4a-2686-47c8-876b-c6cf2192b493-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rn9hl\" (UID: \"dc83aa4a-2686-47c8-876b-c6cf2192b493\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695031 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fb007ad6-c27c-4d9e-8b4f-942db474d37b-profile-collector-cert\") pod \"catalog-operator-68c6474976-jcxv5\" (UID: \"fb007ad6-c27c-4d9e-8b4f-942db474d37b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695064 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3b3388b2-3b24-448b-8806-9857c4e057b9-node-bootstrap-token\") pod \"machine-config-server-2xmlv\" (UID: \"3b3388b2-3b24-448b-8806-9857c4e057b9\") " pod="openshift-machine-config-operator/machine-config-server-2xmlv" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695094 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgdgf\" (UniqueName: 
\"kubernetes.io/projected/4c6d8fb5-74db-4420-90f6-33a5026a755f-kube-api-access-pgdgf\") pod \"service-ca-operator-777779d784-9r6bj\" (UID: \"4c6d8fb5-74db-4420-90f6-33a5026a755f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695110 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q52mz\" (UniqueName: \"kubernetes.io/projected/dc83aa4a-2686-47c8-876b-c6cf2192b493-kube-api-access-q52mz\") pod \"control-plane-machine-set-operator-78cbb6b69f-rn9hl\" (UID: \"dc83aa4a-2686-47c8-876b-c6cf2192b493\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695129 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx585\" (UniqueName: \"kubernetes.io/projected/878aa194-4036-493a-8e75-990c4b57e793-kube-api-access-vx585\") pod \"multus-admission-controller-857f4d67dd-7hljs\" (UID: \"878aa194-4036-493a-8e75-990c4b57e793\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7hljs" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695150 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f9e4580-6380-44b8-aec6-8b4ff897cd82-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pm2hc\" (UID: \"7f9e4580-6380-44b8-aec6-8b4ff897cd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695196 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fb007ad6-c27c-4d9e-8b4f-942db474d37b-srv-cert\") pod \"catalog-operator-68c6474976-jcxv5\" (UID: \"fb007ad6-c27c-4d9e-8b4f-942db474d37b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695223 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94mbc\" (UniqueName: \"kubernetes.io/projected/0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd-kube-api-access-94mbc\") pod \"dns-operator-744455d44c-2l65t\" (UID: \"0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd\") " pod="openshift-dns-operator/dns-operator-744455d44c-2l65t" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695245 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6djt\" (UniqueName: \"kubernetes.io/projected/e7b5b76d-8f89-4753-81a8-81a886d87abf-kube-api-access-h6djt\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvp7h\" (UID: \"e7b5b76d-8f89-4753-81a8-81a886d87abf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695263 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7-metrics-tls\") pod \"dns-default-d2nj6\" (UID: \"5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7\") " pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695280 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6gsl\" (UniqueName: 
\"kubernetes.io/projected/dd4ddb1d-714d-42d7-b307-cd7db58d02de-kube-api-access-r6gsl\") pod \"olm-operator-6b444d44fb-pfvzd\" (UID: \"dd4ddb1d-714d-42d7-b307-cd7db58d02de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695297 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f9e4580-6380-44b8-aec6-8b4ff897cd82-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pm2hc\" (UID: \"7f9e4580-6380-44b8-aec6-8b4ff897cd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695318 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddqrc\" (UniqueName: \"kubernetes.io/projected/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-kube-api-access-ddqrc\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695346 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5a1b274-e180-463c-acf4-d2a5aa181827-metrics-tls\") pod \"ingress-operator-5b745b69d9-kwp96\" (UID: \"c5a1b274-e180-463c-acf4-d2a5aa181827\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695363 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsdnr\" (UniqueName: \"kubernetes.io/projected/c5a1b274-e180-463c-acf4-d2a5aa181827-kube-api-access-zsdnr\") pod \"ingress-operator-5b745b69d9-kwp96\" (UID: \"c5a1b274-e180-463c-acf4-d2a5aa181827\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695385 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7-config-volume\") pod \"dns-default-d2nj6\" (UID: \"5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7\") " pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695402 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrrk\" (UniqueName: \"kubernetes.io/projected/1ca012d6-094a-4703-b8cc-d9d53fa9886d-kube-api-access-qsrrk\") pod \"collect-profiles-29409960-6f6w5\" (UID: \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695425 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-mountpoint-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695443 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-service-ca-bundle\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 
14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695462 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-plugins-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695482 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0de865f-14b4-4fd8-9be7-3b5655814a67-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jsmc4\" (UID: \"f0de865f-14b4-4fd8-9be7-3b5655814a67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695496 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f9e4580-6380-44b8-aec6-8b4ff897cd82-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pm2hc\" (UID: \"7f9e4580-6380-44b8-aec6-8b4ff897cd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695514 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0de865f-14b4-4fd8-9be7-3b5655814a67-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jsmc4\" (UID: \"f0de865f-14b4-4fd8-9be7-3b5655814a67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695534 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca012d6-094a-4703-b8cc-d9d53fa9886d-secret-volume\") pod \"collect-profiles-29409960-6f6w5\" (UID: \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695558 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/878aa194-4036-493a-8e75-990c4b57e793-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7hljs\" (UID: \"878aa194-4036-493a-8e75-990c4b57e793\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7hljs" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695574 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-metrics-certs\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695594 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b5b76d-8f89-4753-81a8-81a886d87abf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvp7h\" (UID: \"e7b5b76d-8f89-4753-81a8-81a886d87abf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695612 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-stats-auth\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695636 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfl8z\" (UniqueName: \"kubernetes.io/projected/e118cb4b-2a7d-4f32-ad96-98ad49c928f7-kube-api-access-qfl8z\") pod \"ingress-canary-42z4z\" (UID: \"e118cb4b-2a7d-4f32-ad96-98ad49c928f7\") " pod="openshift-ingress-canary/ingress-canary-42z4z" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695657 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b615253a-f52e-4607-a63c-7cf1c07dab6b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hdrjx\" (UID: \"b615253a-f52e-4607-a63c-7cf1c07dab6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695673 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5a1b274-e180-463c-acf4-d2a5aa181827-trusted-ca\") pod \"ingress-operator-5b745b69d9-kwp96\" (UID: \"c5a1b274-e180-463c-acf4-d2a5aa181827\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695693 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxcds\" (UniqueName: \"kubernetes.io/projected/fc4fd121-a822-4178-b67f-3f93eedd535e-kube-api-access-lxcds\") pod \"package-server-manager-789f6589d5-79qpf\" (UID: \"fc4fd121-a822-4178-b67f-3f93eedd535e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695719 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dd4ddb1d-714d-42d7-b307-cd7db58d02de-srv-cert\") pod \"olm-operator-6b444d44fb-pfvzd\" (UID: \"dd4ddb1d-714d-42d7-b307-cd7db58d02de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.695740 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgf9b\" (UniqueName: \"kubernetes.io/projected/3b3388b2-3b24-448b-8806-9857c4e057b9-kube-api-access-cgf9b\") pod \"machine-config-server-2xmlv\" (UID: \"3b3388b2-3b24-448b-8806-9857c4e057b9\") " pod="openshift-machine-config-operator/machine-config-server-2xmlv" Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.696093 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.196074751 +0000 UTC m=+140.180288606 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.700710 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fb007ad6-c27c-4d9e-8b4f-942db474d37b-srv-cert\") pod \"catalog-operator-68c6474976-jcxv5\" (UID: \"fb007ad6-c27c-4d9e-8b4f-942db474d37b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.701640 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.704392 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd-metrics-tls\") pod \"dns-operator-744455d44c-2l65t\" (UID: \"0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd\") " pod="openshift-dns-operator/dns-operator-744455d44c-2l65t" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.704913 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7b5b76d-8f89-4753-81a8-81a886d87abf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvp7h\" (UID: \"e7b5b76d-8f89-4753-81a8-81a886d87abf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.705487 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0de865f-14b4-4fd8-9be7-3b5655814a67-config\") pod \"kube-apiserver-operator-766d6c64bb-jsmc4\" (UID: \"f0de865f-14b4-4fd8-9be7-3b5655814a67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.705909 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-registration-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.707033 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-mountpoint-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.707353 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.707827 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-service-ca-bundle\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.708041 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-plugins-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.743870 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7b5b76d-8f89-4753-81a8-81a886d87abf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvp7h\" (UID: \"e7b5b76d-8f89-4753-81a8-81a886d87abf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.744592 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e118cb4b-2a7d-4f32-ad96-98ad49c928f7-cert\") pod \"ingress-canary-42z4z\" (UID: \"e118cb4b-2a7d-4f32-ad96-98ad49c928f7\") " pod="openshift-ingress-canary/ingress-canary-42z4z" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.745033 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3b3388b2-3b24-448b-8806-9857c4e057b9-certs\") pod \"machine-config-server-2xmlv\" (UID: \"3b3388b2-3b24-448b-8806-9857c4e057b9\") " pod="openshift-machine-config-operator/machine-config-server-2xmlv" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.745042 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5a1b274-e180-463c-acf4-d2a5aa181827-trusted-ca\") pod \"ingress-operator-5b745b69d9-kwp96\" (UID: \"c5a1b274-e180-463c-acf4-d2a5aa181827\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.745439 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0de865f-14b4-4fd8-9be7-3b5655814a67-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jsmc4\" (UID: \"f0de865f-14b4-4fd8-9be7-3b5655814a67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.745672 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.745807 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3b3388b2-3b24-448b-8806-9857c4e057b9-node-bootstrap-token\") pod \"machine-config-server-2xmlv\" (UID: \"3b3388b2-3b24-448b-8806-9857c4e057b9\") " pod="openshift-machine-config-operator/machine-config-server-2xmlv" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.746314 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/dd4ddb1d-714d-42d7-b307-cd7db58d02de-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pfvzd\" (UID: \"dd4ddb1d-714d-42d7-b307-cd7db58d02de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.746479 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/878aa194-4036-493a-8e75-990c4b57e793-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7hljs\" (UID: \"878aa194-4036-493a-8e75-990c4b57e793\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7hljs" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.746649 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7-metrics-tls\") pod \"dns-default-d2nj6\" (UID: \"5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7\") " pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.747298 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5a1b274-e180-463c-acf4-d2a5aa181827-metrics-tls\") pod \"ingress-operator-5b745b69d9-kwp96\" (UID: \"c5a1b274-e180-463c-acf4-d2a5aa181827\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.747601 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f9e4580-6380-44b8-aec6-8b4ff897cd82-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pm2hc\" (UID: \"7f9e4580-6380-44b8-aec6-8b4ff897cd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.747709 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-csi-data-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.747895 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-socket-dir\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.748157 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b615253a-f52e-4607-a63c-7cf1c07dab6b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hdrjx\" (UID: \"b615253a-f52e-4607-a63c-7cf1c07dab6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.748680 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca012d6-094a-4703-b8cc-d9d53fa9886d-config-volume\") pod \"collect-profiles-29409960-6f6w5\" (UID: \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.756490 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.761449 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c6d8fb5-74db-4420-90f6-33a5026a755f-config\") pod \"service-ca-operator-777779d784-9r6bj\" (UID: \"4c6d8fb5-74db-4420-90f6-33a5026a755f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.761938 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca012d6-094a-4703-b8cc-d9d53fa9886d-secret-volume\") pod \"collect-profiles-29409960-6f6w5\" (UID: \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.763424 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7-config-volume\") pod \"dns-default-d2nj6\" (UID: \"5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7\") " pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.763884 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f9e4580-6380-44b8-aec6-8b4ff897cd82-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pm2hc\" (UID: \"7f9e4580-6380-44b8-aec6-8b4ff897cd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.775536 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b615253a-f52e-4607-a63c-7cf1c07dab6b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hdrjx\" (UID: \"b615253a-f52e-4607-a63c-7cf1c07dab6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.786557 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c6d8fb5-74db-4420-90f6-33a5026a755f-serving-cert\") pod \"service-ca-operator-777779d784-9r6bj\" (UID: \"4c6d8fb5-74db-4420-90f6-33a5026a755f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.786636 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-stats-auth\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.798893 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-bound-sa-token\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.802485 4585 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rqphx"] Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.803995 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.804570 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.304557346 +0000 UTC m=+140.288771201 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.805763 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/dd4ddb1d-714d-42d7-b307-cd7db58d02de-srv-cert\") pod \"olm-operator-6b444d44fb-pfvzd\" (UID: \"dd4ddb1d-714d-42d7-b307-cd7db58d02de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.806277 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc83aa4a-2686-47c8-876b-c6cf2192b493-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rn9hl\" (UID: \"dc83aa4a-2686-47c8-876b-c6cf2192b493\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.806747 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc4fd121-a822-4178-b67f-3f93eedd535e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-79qpf\" (UID: \"fc4fd121-a822-4178-b67f-3f93eedd535e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.807111 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fb007ad6-c27c-4d9e-8b4f-942db474d37b-profile-collector-cert\") pod \"catalog-operator-68c6474976-jcxv5\" (UID: \"fb007ad6-c27c-4d9e-8b4f-942db474d37b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.809620 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-default-certificate\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " 
pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.812330 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-metrics-certs\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.819175 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwps2\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-kube-api-access-dwps2\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.859013 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c7gls"] Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.862038 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94mbc\" (UniqueName: \"kubernetes.io/projected/0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd-kube-api-access-94mbc\") pod \"dns-operator-744455d44c-2l65t\" (UID: \"0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd\") " pod="openshift-dns-operator/dns-operator-744455d44c-2l65t" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.864280 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-p8vxn"] Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.867293 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6djt\" (UniqueName: \"kubernetes.io/projected/e7b5b76d-8f89-4753-81a8-81a886d87abf-kube-api-access-h6djt\") pod \"kube-storage-version-migrator-operator-b67b599dd-gvp7h\" (UID: \"e7b5b76d-8f89-4753-81a8-81a886d87abf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.880145 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f9e4580-6380-44b8-aec6-8b4ff897cd82-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pm2hc\" (UID: \"7f9e4580-6380-44b8-aec6-8b4ff897cd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.906037 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:25 crc kubenswrapper[4585]: E1201 14:00:25.906584 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.406565501 +0000 UTC m=+140.390779356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.926516 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6gsl\" (UniqueName: \"kubernetes.io/projected/dd4ddb1d-714d-42d7-b307-cd7db58d02de-kube-api-access-r6gsl\") pod \"olm-operator-6b444d44fb-pfvzd\" (UID: \"dd4ddb1d-714d-42d7-b307-cd7db58d02de\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.930784 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgf9b\" (UniqueName: \"kubernetes.io/projected/3b3388b2-3b24-448b-8806-9857c4e057b9-kube-api-access-cgf9b\") pod \"machine-config-server-2xmlv\" (UID: \"3b3388b2-3b24-448b-8806-9857c4e057b9\") " pod="openshift-machine-config-operator/machine-config-server-2xmlv" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.947012 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddqrc\" (UniqueName: \"kubernetes.io/projected/bbfadd6e-84a7-4fa8-9766-0358345cf2e2-kube-api-access-ddqrc\") pod \"csi-hostpathplugin-c5f9s\" (UID: \"bbfadd6e-84a7-4fa8-9766-0358345cf2e2\") " pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.949450 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrrk\" (UniqueName: \"kubernetes.io/projected/1ca012d6-094a-4703-b8cc-d9d53fa9886d-kube-api-access-qsrrk\") pod \"collect-profiles-29409960-6f6w5\" (UID: \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.949667 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv"] Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.965156 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rrbm\" (UniqueName: \"kubernetes.io/projected/fb007ad6-c27c-4d9e-8b4f-942db474d37b-kube-api-access-2rrbm\") pod \"catalog-operator-68c6474976-jcxv5\" (UID: \"fb007ad6-c27c-4d9e-8b4f-942db474d37b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:25 crc kubenswrapper[4585]: I1201 14:00:25.985392 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5a1b274-e180-463c-acf4-d2a5aa181827-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kwp96\" (UID: \"c5a1b274-e180-463c-acf4-d2a5aa181827\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.003082 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qth7v\" (UniqueName: \"kubernetes.io/projected/e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428-kube-api-access-qth7v\") pod \"router-default-5444994796-fxg9h\" (UID: \"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428\") " 
pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.003274 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2l65t" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.007475 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:26 crc kubenswrapper[4585]: E1201 14:00:26.008510 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.508489284 +0000 UTC m=+140.492703239 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.013040 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.020760 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f0de865f-14b4-4fd8-9be7-3b5655814a67-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jsmc4\" (UID: \"f0de865f-14b4-4fd8-9be7-3b5655814a67\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.036593 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.041791 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfl8z\" (UniqueName: \"kubernetes.io/projected/e118cb4b-2a7d-4f32-ad96-98ad49c928f7-kube-api-access-qfl8z\") pod \"ingress-canary-42z4z\" (UID: \"e118cb4b-2a7d-4f32-ad96-98ad49c928f7\") " pod="openshift-ingress-canary/ingress-canary-42z4z" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.061852 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.063276 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsdnr\" (UniqueName: \"kubernetes.io/projected/c5a1b274-e180-463c-acf4-d2a5aa181827-kube-api-access-zsdnr\") pod \"ingress-operator-5b745b69d9-kwp96\" (UID: \"c5a1b274-e180-463c-acf4-d2a5aa181827\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.079506 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.080713 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.082022 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.084546 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxcds\" (UniqueName: \"kubernetes.io/projected/fc4fd121-a822-4178-b67f-3f93eedd535e-kube-api-access-lxcds\") pod \"package-server-manager-789f6589d5-79qpf\" (UID: \"fc4fd121-a822-4178-b67f-3f93eedd535e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.090256 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.098660 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.101595 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57v2v\" (UniqueName: \"kubernetes.io/projected/b615253a-f52e-4607-a63c-7cf1c07dab6b-kube-api-access-57v2v\") pod \"marketplace-operator-79b997595-hdrjx\" (UID: \"b615253a-f52e-4607-a63c-7cf1c07dab6b\") " pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.103035 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.113782 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-42z4z" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.114656 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.114790 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhj9p\" (UniqueName: \"kubernetes.io/projected/795dab1c-49d5-4b05-a84f-4e1655d459fc-kube-api-access-nhj9p\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:26 crc kubenswrapper[4585]: E1201 14:00:26.115035 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.615020485 +0000 UTC m=+140.599234340 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.122718 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhj9p\" (UniqueName: \"kubernetes.io/projected/795dab1c-49d5-4b05-a84f-4e1655d459fc-kube-api-access-nhj9p\") pod \"machine-api-operator-5694c8668f-42pj4\" (UID: \"795dab1c-49d5-4b05-a84f-4e1655d459fc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.137575 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.143186 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2xmlv" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.144557 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgdgf\" (UniqueName: \"kubernetes.io/projected/4c6d8fb5-74db-4420-90f6-33a5026a755f-kube-api-access-pgdgf\") pod \"service-ca-operator-777779d784-9r6bj\" (UID: \"4c6d8fb5-74db-4420-90f6-33a5026a755f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.178232 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.192012 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q52mz\" (UniqueName: \"kubernetes.io/projected/dc83aa4a-2686-47c8-876b-c6cf2192b493-kube-api-access-q52mz\") pod \"control-plane-machine-set-operator-78cbb6b69f-rn9hl\" (UID: \"dc83aa4a-2686-47c8-876b-c6cf2192b493\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.216101 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls596\" (UniqueName: \"kubernetes.io/projected/e0b7b830-078c-4448-b914-ab62e5ff7059-kube-api-access-ls596\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.216176 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.216437 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx585\" (UniqueName: \"kubernetes.io/projected/878aa194-4036-493a-8e75-990c4b57e793-kube-api-access-vx585\") pod \"multus-admission-controller-857f4d67dd-7hljs\" (UID: \"878aa194-4036-493a-8e75-990c4b57e793\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7hljs" Dec 01 14:00:26 crc kubenswrapper[4585]: E1201 14:00:26.216626 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.716612127 +0000 UTC m=+140.700826062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.317204 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:26 crc kubenswrapper[4585]: E1201 14:00:26.317394 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.817360753 +0000 UTC m=+140.801574608 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.317562 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:26 crc kubenswrapper[4585]: E1201 14:00:26.317913 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.81790223 +0000 UTC m=+140.802116145 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.320812 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.327788 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7hljs" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.345398 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.352792 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.356835 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" event={"ID":"a988a2aa-8447-47fe-9b03-771d1633d69f","Type":"ContainerStarted","Data":"95ea318d936052756c82e81dc442cf60c18cc4d4c8f33f1ae90e2246645f0a28"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.358251 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" event={"ID":"766043b8-3f15-4428-855d-1e82aca4fb63","Type":"ContainerStarted","Data":"f021083197c6188c26aa9975bfa67ab73df8e4e0d6be827c18740315a451ee1e"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.359419 4585 generic.go:334] "Generic (PLEG): container finished" podID="7918cfa2-6bbe-4434-a62f-8b06e3ff324e" containerID="05d397925091a4a9693f81c4140392f77358f9674101afe6e71f1ad555008569" exitCode=0 Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.359502 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" event={"ID":"7918cfa2-6bbe-4434-a62f-8b06e3ff324e","Type":"ContainerDied","Data":"05d397925091a4a9693f81c4140392f77358f9674101afe6e71f1ad555008569"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.360114 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" event={"ID":"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2","Type":"ContainerStarted","Data":"a26d335143be09d73ba7703226cbb3fefb4f569d5b65d4d2443a7ac960d4c6d1"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.361244 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" event={"ID":"e3752441-ce0c-46e5-bf1c-3bfab4ae6819","Type":"ContainerStarted","Data":"8dff68b57cca6caf7dc4002ad28df0078468f31bb953104f56f80e5c6c4c9265"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.361925 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" event={"ID":"79abd33c-0184-473e-8bb9-c408a5c32efc","Type":"ContainerStarted","Data":"c46d3fa15dabe36f629944748219ada1ed13fbd3eae3e4a9a3525fb66a6a7a8d"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.362964 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" event={"ID":"6fb639a8-cf5f-4b09-aae1-35844b2e792d","Type":"ContainerStarted","Data":"112ab330e6555663de723d1ca61f595eac670bec08a7b4d9badab1d21ad74ed4"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.363840 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hx54r" event={"ID":"acf6d794-22cb-4a06-adf1-4bea5f76e854","Type":"ContainerStarted","Data":"22412dcd77635f9cd15d26b4e600c57ca7bc066da1fb9757d07ba8e31b6b38c7"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.363944 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hx54r" event={"ID":"acf6d794-22cb-4a06-adf1-4bea5f76e854","Type":"ContainerStarted","Data":"8d80aa3323565ce39fb1f85d528194157b281342c6e0d375646f82ebf7a8d15d"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.364087 4585 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.364673 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rqphx" event={"ID":"1d15ad5d-2ee0-4543-8e56-89fa7a2461f7","Type":"ContainerStarted","Data":"703c254e307804ff138151b8429a62b79960793a035274a492b3bf7109b26187"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.365618 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" event={"ID":"406a9c63-4d0d-4c27-8ade-804cd92b0985","Type":"ContainerStarted","Data":"d8fbfded3662d7a61e1375f3f261fde718beda877baa92aa203f5117832c35b0"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.366614 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" event={"ID":"7aedeec4-fc41-47aa-85a1-d5a92de50deb","Type":"ContainerStarted","Data":"3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.366811 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.367878 4585 generic.go:334] "Generic (PLEG): container finished" podID="bf71ed2f-6b27-4ce7-93ae-6f5ced50b306" containerID="23605b8a1a5ab67da7742b8ef1b53e8e845814696b5f5c3f23a6e11ee2f7dd39" exitCode=0 Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.368675 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" event={"ID":"bf71ed2f-6b27-4ce7-93ae-6f5ced50b306","Type":"ContainerDied","Data":"23605b8a1a5ab67da7742b8ef1b53e8e845814696b5f5c3f23a6e11ee2f7dd39"} Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.379850 4585 patch_prober.go:28] interesting pod/console-operator-58897d9998-hx54r container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.379928 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hx54r" podUID="acf6d794-22cb-4a06-adf1-4bea5f76e854" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz\": dial tcp 10.217.0.34:8443: connect: connection refused" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.418910 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:26 crc kubenswrapper[4585]: E1201 14:00:26.419556 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:26.919539774 +0000 UTC m=+140.903753629 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.422799 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlbsz\" (UniqueName: \"kubernetes.io/projected/5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7-kube-api-access-zlbsz\") pod \"dns-default-d2nj6\" (UID: \"5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7\") " pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.452695 4585 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-2vtl4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.452750 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" podUID="7aedeec4-fc41-47aa-85a1-d5a92de50deb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.453616 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls596\" (UniqueName: \"kubernetes.io/projected/e0b7b830-078c-4448-b914-ab62e5ff7059-kube-api-access-ls596\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.457182 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dsfs8"] Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.472790 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh"] Dec 01 14:00:26 crc kubenswrapper[4585]: W1201 14:00:26.484862 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c799b6_ed27_4b1c_9751_55fdcc101362.slice/crio-51240faa6c72a506fe1babadbbacafe7e865259f1b416aa0d1a7c3a6e72f920e WatchSource:0}: Error finding container 51240faa6c72a506fe1babadbbacafe7e865259f1b416aa0d1a7c3a6e72f920e: Status 404 returned error can't find the container with id 51240faa6c72a506fe1babadbbacafe7e865259f1b416aa0d1a7c3a6e72f920e Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.523575 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:26 crc kubenswrapper[4585]: W1201 14:00:26.527820 4585 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc239d6eb_535e_442b_a67a_f8227313ceb4.slice/crio-8b7dfa209eaddf7a4bb3ddbe4f1af8030091ce07057ad4ae831779fc9148fd31 WatchSource:0}: Error finding container 8b7dfa209eaddf7a4bb3ddbe4f1af8030091ce07057ad4ae831779fc9148fd31: Status 404 returned error can't find the container with id 8b7dfa209eaddf7a4bb3ddbe4f1af8030091ce07057ad4ae831779fc9148fd31 Dec 01 14:00:26 crc kubenswrapper[4585]: E1201 14:00:26.535917 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:27.035895802 +0000 UTC m=+141.020109657 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.632053 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:26 crc kubenswrapper[4585]: E1201 14:00:26.651383 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:27.151350761 +0000 UTC m=+141.135564616 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.723644 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.736614 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b7b830-078c-4448-b914-ab62e5ff7059-serving-cert\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.736669 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.736804 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.737773 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:26 crc kubenswrapper[4585]: E1201 14:00:26.738840 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:27.238823978 +0000 UTC m=+141.223037833 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.753128 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b7b830-078c-4448-b914-ab62e5ff7059-serving-cert\") pod \"controller-manager-879f6c89f-n4pr6\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.791688 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.837997 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:26 crc kubenswrapper[4585]: E1201 14:00:26.838400 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:27.338385325 +0000 UTC m=+141.322599180 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.957432 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m8p5b"] Dec 01 14:00:26 crc kubenswrapper[4585]: I1201 14:00:26.958157 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:26 crc kubenswrapper[4585]: E1201 14:00:26.959055 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:27.459037752 +0000 UTC m=+141.443251687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.061370 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:27 crc kubenswrapper[4585]: E1201 14:00:27.066061 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 14:00:27.566037979 +0000 UTC m=+141.550251834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.066198 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:27 crc kubenswrapper[4585]: E1201 14:00:27.066498 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:27.566490663 +0000 UTC m=+141.550704518 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.168504 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:27 crc kubenswrapper[4585]: E1201 14:00:27.168822 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:27.668807459 +0000 UTC m=+141.653021304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.279735 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:27 crc kubenswrapper[4585]: E1201 14:00:27.280181 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:27.780169656 +0000 UTC m=+141.764383511 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.384402 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:27 crc kubenswrapper[4585]: E1201 14:00:27.385243 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:27.8852219 +0000 UTC m=+141.869435765 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.468568 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm" event={"ID":"bf56f15d-84d6-47c7-b8f7-1c9922d8f53f","Type":"ContainerStarted","Data":"35278dfb1b90b885927a12cfa26ebab3f24abdfe6a92ec5244e3fb647cac5d8a"} Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.486623 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:27 crc kubenswrapper[4585]: E1201 14:00:27.487070 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:27.98705532 +0000 UTC m=+141.971269175 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.499912 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r6pct" podStartSLOduration=121.499895554 podStartE2EDuration="2m1.499895554s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:27.408335425 +0000 UTC m=+141.392549280" watchObservedRunningTime="2025-12-01 14:00:27.499895554 +0000 UTC m=+141.484109399" Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.587627 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:27 crc kubenswrapper[4585]: E1201 14:00:27.587743 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:28.087725973 +0000 UTC m=+142.071939828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.587857 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.588091 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" event={"ID":"59ce8ebf-2806-4ff3-bd4c-7f1ace81f7e2","Type":"ContainerStarted","Data":"f06c70c3acf4a2dc142043775e49ea5aa073e2e19dd6ed650188332cecbed138"} Dec 01 14:00:27 crc kubenswrapper[4585]: E1201 14:00:27.588111 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:28.088103445 +0000 UTC m=+142.072317300 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.689445 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:27 crc kubenswrapper[4585]: E1201 14:00:27.689806 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:28.189782241 +0000 UTC m=+142.173996096 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.689870 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:27 crc kubenswrapper[4585]: E1201 14:00:27.690164 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:28.190155063 +0000 UTC m=+142.174368918 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.696027 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-hx54r" podStartSLOduration=121.696013931 podStartE2EDuration="2m1.696013931s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:27.695430173 +0000 UTC m=+141.679644028" watchObservedRunningTime="2025-12-01 14:00:27.696013931 +0000 UTC m=+141.680227787" Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.763825 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" event={"ID":"6fb639a8-cf5f-4b09-aae1-35844b2e792d","Type":"ContainerStarted","Data":"0a61fa056db4c87e27622f2af490acc35a3b67601b9b6eb7c8c95c6c900cee3e"} Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.804093 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:27 crc kubenswrapper[4585]: E1201 14:00:27.804523 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:28.304496156 +0000 UTC m=+142.288710001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.828253 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-42pj4"] Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.890482 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z48d" podStartSLOduration=121.890467475 podStartE2EDuration="2m1.890467475s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:27.888611245 +0000 UTC m=+141.872825100" watchObservedRunningTime="2025-12-01 14:00:27.890467475 +0000 UTC m=+141.874681330" Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.906889 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:27 crc kubenswrapper[4585]: E1201 14:00:27.907428 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:28.407415651 +0000 UTC m=+142.391629506 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:27 crc kubenswrapper[4585]: I1201 14:00:27.937268 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" event={"ID":"e3752441-ce0c-46e5-bf1c-3bfab4ae6819","Type":"ContainerStarted","Data":"b75a3539dc1388b955992669374457fbcdd452f2461838377068c1432d1c14b1"} Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.008794 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-sljgx" podStartSLOduration=121.008774766 podStartE2EDuration="2m1.008774766s" podCreationTimestamp="2025-12-01 13:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:28.005439899 +0000 UTC m=+141.989653754" watchObservedRunningTime="2025-12-01 14:00:28.008774766 +0000 UTC m=+141.992988631" Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.011934 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:28 crc kubenswrapper[4585]: E1201 14:00:28.012316 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:28.51230166 +0000 UTC m=+142.496515515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.050488 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" event={"ID":"dcb5aba5-cd6c-4535-a92d-d1583fe02b18","Type":"ContainerStarted","Data":"3513c99fae64594fa9220fd1f349bb1a7e62cff0cd17de96e04960341e63c126"} Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.062743 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd"] Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.124985 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:28 crc kubenswrapper[4585]: E1201 14:00:28.127272 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:28.627259333 +0000 UTC m=+142.611473188 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.191304 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" event={"ID":"766043b8-3f15-4428-855d-1e82aca4fb63","Type":"ContainerStarted","Data":"ef41dc6f44f6760510ff983dd89336cf4496ce51e30f90b92d508f11533d8f04"} Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.196171 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.230169 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:28 crc kubenswrapper[4585]: E1201 14:00:28.230516 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 14:00:28.730501728 +0000 UTC m=+142.714715583 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.240640 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" event={"ID":"406a9c63-4d0d-4c27-8ade-804cd92b0985","Type":"ContainerStarted","Data":"a8fab2e310389cd2d29478a10a5f06d95bd7dbde10f901f32afc69d2c5b3598e"} Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.274785 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" event={"ID":"fe562f92-5985-4fbf-a5b9-8359e7a044d9","Type":"ContainerStarted","Data":"c7e116a813ace8b66891ec374c73f5baf93891094c04698fa33ef0dc513a5932"} Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.277349 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-f9k95" podStartSLOduration=122.277326287 podStartE2EDuration="2m2.277326287s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:28.272006705 +0000 UTC m=+142.256220560" watchObservedRunningTime="2025-12-01 14:00:28.277326287 +0000 UTC m=+142.261540142" Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.292117 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dsfs8" event={"ID":"c239d6eb-535e-442b-a67a-f8227313ceb4","Type":"ContainerStarted","Data":"8b7dfa209eaddf7a4bb3ddbe4f1af8030091ce07057ad4ae831779fc9148fd31"} Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.320211 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" podStartSLOduration=121.320190427 podStartE2EDuration="2m1.320190427s" podCreationTimestamp="2025-12-01 13:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:28.305325549 +0000 UTC m=+142.289539404" watchObservedRunningTime="2025-12-01 14:00:28.320190427 +0000 UTC m=+142.304404282" Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.332897 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:28 crc kubenswrapper[4585]: E1201 14:00:28.333799 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:28.833787515 +0000 UTC m=+142.818001360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.346379 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" event={"ID":"81c799b6-ed27-4b1c-9751-55fdcc101362","Type":"ContainerStarted","Data":"51240faa6c72a506fe1babadbbacafe7e865259f1b416aa0d1a7c3a6e72f920e"} Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.379664 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.427033 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6wzhb" podStartSLOduration=122.427014538 podStartE2EDuration="2m2.427014538s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:28.424937091 +0000 UTC m=+142.409150966" watchObservedRunningTime="2025-12-01 14:00:28.427014538 +0000 UTC m=+142.411228393" Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.433462 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:28 crc kubenswrapper[4585]: E1201 14:00:28.434757 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:28.934737307 +0000 UTC m=+142.918951162 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.534810 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:28 crc kubenswrapper[4585]: E1201 14:00:28.564163 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:29.064095164 +0000 UTC m=+143.048309009 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.662291 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:28 crc kubenswrapper[4585]: E1201 14:00:28.663110 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:29.163092433 +0000 UTC m=+143.147306288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.663147 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" podStartSLOduration=121.663125854 podStartE2EDuration="2m1.663125854s" podCreationTimestamp="2025-12-01 13:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:28.659367243 +0000 UTC m=+142.643581098" watchObservedRunningTime="2025-12-01 14:00:28.663125854 +0000 UTC m=+142.647339709" Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.742992 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xjls2" podStartSLOduration=122.742953365 podStartE2EDuration="2m2.742953365s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:28.742219802 +0000 UTC m=+142.726433657" watchObservedRunningTime="2025-12-01 14:00:28.742953365 +0000 UTC m=+142.727167220" Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.770298 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:28 crc kubenswrapper[4585]: E1201 14:00:28.770669 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:29.270657208 +0000 UTC m=+143.254871053 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.872669 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:28 crc kubenswrapper[4585]: E1201 14:00:28.890799 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:29.390759707 +0000 UTC m=+143.374973562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.907902 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-hx54r" Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.923831 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-tgbx7" podStartSLOduration=122.923810011 podStartE2EDuration="2m2.923810011s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:28.860309895 +0000 UTC m=+142.844523750" watchObservedRunningTime="2025-12-01 14:00:28.923810011 +0000 UTC m=+142.908023876" Dec 01 14:00:28 crc kubenswrapper[4585]: I1201 14:00:28.992243 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:28 crc kubenswrapper[4585]: E1201 14:00:28.992574 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:29.492559115 +0000 UTC m=+143.476772970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.100000 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:29 crc kubenswrapper[4585]: E1201 14:00:29.100138 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:29.60010996 +0000 UTC m=+143.584323815 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.100202 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:29 crc kubenswrapper[4585]: E1201 14:00:29.100645 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:29.600628796 +0000 UTC m=+143.584842651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.109033 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pttbb" podStartSLOduration=123.109012327 podStartE2EDuration="2m3.109012327s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:29.026731586 +0000 UTC m=+143.010945461" watchObservedRunningTime="2025-12-01 14:00:29.109012327 +0000 UTC m=+143.093226182" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.111927 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-p8vxn" podStartSLOduration=123.1119172 podStartE2EDuration="2m3.1119172s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:29.106350471 +0000 UTC m=+143.090564326" watchObservedRunningTime="2025-12-01 14:00:29.1119172 +0000 UTC m=+143.096131055" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.113601 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf"] Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.200356 4585 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cqw6s container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.200404 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" podUID="766043b8-3f15-4428-855d-1e82aca4fb63" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.17:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.201995 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:29 crc kubenswrapper[4585]: E1201 14:00:29.202174 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:29.702151887 +0000 UTC m=+143.686365742 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.202264 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:29 crc kubenswrapper[4585]: E1201 14:00:29.202640 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:29.702629762 +0000 UTC m=+143.686843677 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.303360 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:29 crc kubenswrapper[4585]: E1201 14:00:29.304155 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:29.804140642 +0000 UTC m=+143.788354497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.379714 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rqphx" event={"ID":"1d15ad5d-2ee0-4543-8e56-89fa7a2461f7","Type":"ContainerStarted","Data":"77a54c1ab1de538b4af5dfb489baac3c6e4aa5a4a46a85c7899c0b83cfbc4871"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.379771 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rqphx" event={"ID":"1d15ad5d-2ee0-4543-8e56-89fa7a2461f7","Type":"ContainerStarted","Data":"83fc0ef9731daf1e6e795c5ed58fc2f3e554300383d61417f1ad155b6978feb4"} Dec 01 14:00:29 crc kubenswrapper[4585]: W1201 14:00:29.380604 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc4fd121_a822_4178_b67f_3f93eedd535e.slice/crio-4d4104fe859a57c5d683e9c4c50bb3724062b50e8c52bef532df4cce03ae7a5d WatchSource:0}: Error finding container 4d4104fe859a57c5d683e9c4c50bb3724062b50e8c52bef532df4cce03ae7a5d: Status 404 returned error can't find the container with id 4d4104fe859a57c5d683e9c4c50bb3724062b50e8c52bef532df4cce03ae7a5d Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.407731 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:29 crc kubenswrapper[4585]: E1201 14:00:29.408037 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:29.908024788 +0000 UTC m=+143.892238643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.417942 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" event={"ID":"79abd33c-0184-473e-8bb9-c408a5c32efc","Type":"ContainerStarted","Data":"35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.418821 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.427543 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rqphx" podStartSLOduration=123.427524486 podStartE2EDuration="2m3.427524486s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:29.427519176 +0000 UTC m=+143.411733031" watchObservedRunningTime="2025-12-01 14:00:29.427524486 +0000 UTC m=+143.411738341" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.433316 4585 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-c7gls container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.433391 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" podUID="79abd33c-0184-473e-8bb9-c408a5c32efc" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.458439 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" podStartSLOduration=123.458421171 podStartE2EDuration="2m3.458421171s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:29.456376035 +0000 UTC m=+143.440589900" watchObservedRunningTime="2025-12-01 14:00:29.458421171 +0000 UTC m=+143.442635026" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.464568 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" event={"ID":"81c799b6-ed27-4b1c-9751-55fdcc101362","Type":"ContainerStarted","Data":"2b9076079959b60b0e8b4d0192ff76d5ea50e3520571bd2dbb88f6ef74ffbb45"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.491742 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" 
event={"ID":"dcb5aba5-cd6c-4535-a92d-d1583fe02b18","Type":"ContainerStarted","Data":"b7abd93df5c4604addc87e1f0a12441c30ba5b98b8d51652e4eaa2e484cd27f2"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.513565 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:29 crc kubenswrapper[4585]: E1201 14:00:29.513893 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:30.013869027 +0000 UTC m=+143.998082882 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.514104 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:29 crc kubenswrapper[4585]: E1201 14:00:29.515120 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:30.015108557 +0000 UTC m=+143.999322412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.523777 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm" event={"ID":"bf56f15d-84d6-47c7-b8f7-1c9922d8f53f","Type":"ContainerStarted","Data":"8c9b995ca2003dea46a65323daf8d6e998201dbc4cd996041e1b2def6963c414"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.523830 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm" event={"ID":"bf56f15d-84d6-47c7-b8f7-1c9922d8f53f","Type":"ContainerStarted","Data":"d022ab095a36b9f9d610d3fb6389316d5662cd9bd0b7e0b48a38e32bbbf8905d"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.524776 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2xmlv" event={"ID":"3b3388b2-3b24-448b-8806-9857c4e057b9","Type":"ContainerStarted","Data":"f5a417b3f069d9372ffb29f6099cb54762c21ed328939b6ba6bc8ca46ece026f"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.524798 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2xmlv" event={"ID":"3b3388b2-3b24-448b-8806-9857c4e057b9","Type":"ContainerStarted","Data":"b5a12b37e27f5adbd3a66977705494bee89156d2a58d1c7f07ed71c56f70397f"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.542230 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" event={"ID":"795dab1c-49d5-4b05-a84f-4e1655d459fc","Type":"ContainerStarted","Data":"7530e805bf07487889dffa54fb76a364192db40ea0f9827299df263f48753e8d"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.545740 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h"] Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.557235 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fxg9h" event={"ID":"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428","Type":"ContainerStarted","Data":"f06bf7f6b6aed14c333f0a01298bb4acd2acb348ce4a98ee572c215c69895b0b"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.557289 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fxg9h" event={"ID":"e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428","Type":"ContainerStarted","Data":"dad2f71b91d90915ef7dc92caa8179efd48c3500fc3fe896d6b03bad9a111330"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.572771 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" event={"ID":"bf71ed2f-6b27-4ce7-93ae-6f5ced50b306","Type":"ContainerStarted","Data":"f36e8f7b26d5778dc693d326a388e1c2026e4b826bb9f75471bfaa63ce1c3f37"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.573153 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.598468 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" event={"ID":"7918cfa2-6bbe-4434-a62f-8b06e3ff324e","Type":"ContainerStarted","Data":"a48d218d8432d32fa15cc79cc413d1f365403c6a019d2e756114801efaaeb2c2"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.615023 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bcsxh" podStartSLOduration=123.615001655 podStartE2EDuration="2m3.615001655s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:29.563819726 +0000 UTC m=+143.548033571" watchObservedRunningTime="2025-12-01 14:00:29.615001655 +0000 UTC m=+143.599215510" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.617430 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:29 crc kubenswrapper[4585]: E1201 14:00:29.618082 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:30.118061863 +0000 UTC m=+144.102275718 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.624564 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" event={"ID":"dd4ddb1d-714d-42d7-b307-cd7db58d02de","Type":"ContainerStarted","Data":"cf394da26cd048ce500b527f61530da550d884173d0ab4be2d5a8ab6e7cb951e"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.624634 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" event={"ID":"dd4ddb1d-714d-42d7-b307-cd7db58d02de","Type":"ContainerStarted","Data":"542c8ef4a0ae92b192dae5875815fe38b17122cb2ab24acfc0d150f5fc14cd42"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.634235 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dsfs8" event={"ID":"c239d6eb-535e-442b-a67a-f8227313ceb4","Type":"ContainerStarted","Data":"b49db3d9ac3c761d310e04175bc8ae16aeae4f17d66fafb39d0c663aacb5789b"} Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.634276 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dsfs8" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.635869 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h9trv" podStartSLOduration=123.635858797 podStartE2EDuration="2m3.635858797s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:29.617760554 +0000 UTC m=+143.601974429" watchObservedRunningTime="2025-12-01 14:00:29.635858797 +0000 UTC m=+143.620072652" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.637995 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.655663 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-42z4z"] Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.677298 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5"] Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.742203 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.744180 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-fxg9h" podStartSLOduration=123.744160254 podStartE2EDuration="2m3.744160254s" 
podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:29.717355921 +0000 UTC m=+143.701569776" watchObservedRunningTime="2025-12-01 14:00:29.744160254 +0000 UTC m=+143.728374109" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.750958 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.751012 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.751061 4585 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-pfvzd container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.751074 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" podUID="dd4ddb1d-714d-42d7-b307-cd7db58d02de" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Dec 01 14:00:29 crc kubenswrapper[4585]: E1201 14:00:29.770299 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:30.270284026 +0000 UTC m=+144.254497871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.774201 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc"] Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.786222 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cqw6s" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.859393 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-2xmlv" podStartSLOduration=6.859375336 podStartE2EDuration="6.859375336s" podCreationTimestamp="2025-12-01 14:00:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:29.859325584 +0000 UTC m=+143.843539429" watchObservedRunningTime="2025-12-01 14:00:29.859375336 +0000 UTC m=+143.843589201" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.860128 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-flcxm" podStartSLOduration=123.86012043 podStartE2EDuration="2m3.86012043s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:29.806388549 +0000 UTC m=+143.790602404" watchObservedRunningTime="2025-12-01 14:00:29.86012043 +0000 UTC m=+143.844334285" Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.860788 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:29 crc kubenswrapper[4585]: E1201 14:00:29.861546 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:30.361531335 +0000 UTC m=+144.345745190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:29 crc kubenswrapper[4585]: I1201 14:00:29.971011 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:29 crc kubenswrapper[4585]: E1201 14:00:29.971311 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:30.471299581 +0000 UTC m=+144.455513436 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.006236 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" podStartSLOduration=124.006217745 podStartE2EDuration="2m4.006217745s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:29.980341072 +0000 UTC m=+143.964554927" watchObservedRunningTime="2025-12-01 14:00:30.006217745 +0000 UTC m=+143.990431600" Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.082859 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:30 crc kubenswrapper[4585]: E1201 14:00:30.083276 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:30.583257277 +0000 UTC m=+144.567471142 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.083458 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.092548 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" podStartSLOduration=123.092530786 podStartE2EDuration="2m3.092530786s" podCreationTimestamp="2025-12-01 13:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:30.046075369 +0000 UTC m=+144.030289224" watchObservedRunningTime="2025-12-01 14:00:30.092530786 +0000 UTC m=+144.076744641" Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.098931 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:30 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:30 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:30 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.098998 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.142560 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dsfs8" podStartSLOduration=124.142513956 podStartE2EDuration="2m4.142513956s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:30.140599214 +0000 UTC m=+144.124813069" watchObservedRunningTime="2025-12-01 14:00:30.142513956 +0000 UTC m=+144.126727811" Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.144333 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" podStartSLOduration=123.144322704 podStartE2EDuration="2m3.144322704s" podCreationTimestamp="2025-12-01 13:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:30.094400246 +0000 UTC m=+144.078614111" watchObservedRunningTime="2025-12-01 14:00:30.144322704 +0000 UTC m=+144.128536559" Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.190955 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:30 crc kubenswrapper[4585]: E1201 14:00:30.191318 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:30.691306787 +0000 UTC m=+144.675520642 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.309744 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:30 crc kubenswrapper[4585]: E1201 14:00:30.310192 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:30.810173666 +0000 UTC m=+144.794387521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.417924 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:30 crc kubenswrapper[4585]: E1201 14:00:30.418361 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:30.918339891 +0000 UTC m=+144.902553806 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.523458 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:30 crc kubenswrapper[4585]: E1201 14:00:30.523690 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:31.023659603 +0000 UTC m=+145.007873458 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.523767 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:30 crc kubenswrapper[4585]: E1201 14:00:30.524186 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:31.024172759 +0000 UTC m=+145.008386624 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.625384 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:30 crc kubenswrapper[4585]: E1201 14:00:30.625688 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:31.125673589 +0000 UTC m=+145.109887444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.678583 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" event={"ID":"fc4fd121-a822-4178-b67f-3f93eedd535e","Type":"ContainerStarted","Data":"eefbc303d6a038959f07aa9283fed57e017d3a7f13f400b63a777bd4ae31159a"} Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.678622 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" event={"ID":"fc4fd121-a822-4178-b67f-3f93eedd535e","Type":"ContainerStarted","Data":"4d4104fe859a57c5d683e9c4c50bb3724062b50e8c52bef532df4cce03ae7a5d"} Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.679256 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" event={"ID":"fb007ad6-c27c-4d9e-8b4f-942db474d37b","Type":"ContainerStarted","Data":"4fa6e05fad216661bea3cd8e2038747ef24e31bbee09e39d895f1324b1595d27"} Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.679798 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" event={"ID":"7f9e4580-6380-44b8-aec6-8b4ff897cd82","Type":"ContainerStarted","Data":"ee694dcb57adc86114023731649678f6b0bae4c1a48b93642519a2eba55751ab"} Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.685161 4585 generic.go:334] "Generic (PLEG): container finished" podID="fe562f92-5985-4fbf-a5b9-8359e7a044d9" containerID="181017314dc990009a08710fd2363f849b8b3139a80a8ef79f1952f702ab16a9" exitCode=0 Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.685238 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" 
event={"ID":"fe562f92-5985-4fbf-a5b9-8359e7a044d9","Type":"ContainerDied","Data":"181017314dc990009a08710fd2363f849b8b3139a80a8ef79f1952f702ab16a9"} Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.685267 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" event={"ID":"fe562f92-5985-4fbf-a5b9-8359e7a044d9","Type":"ContainerStarted","Data":"ab5971fb4dbe6a44895bd978dd01aeceeecfe9a191bb93867b2b15301f0c1eb0"} Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.696095 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" event={"ID":"795dab1c-49d5-4b05-a84f-4e1655d459fc","Type":"ContainerStarted","Data":"56db1c50386c4e0b294e03452ef9ea04189a3984e1f7e18e159f77d7bdbe4f26"} Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.696130 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" event={"ID":"795dab1c-49d5-4b05-a84f-4e1655d459fc","Type":"ContainerStarted","Data":"86623e23b660a9c88b6097fe18e6a04858b4d31c6e5b466f3a00619b6f49e1f7"} Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.729650 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:30 crc kubenswrapper[4585]: E1201 14:00:30.730092 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:31.230076592 +0000 UTC m=+145.214290447 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.731097 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" event={"ID":"e7b5b76d-8f89-4753-81a8-81a886d87abf","Type":"ContainerStarted","Data":"0daeea84d0832a365525563bad0969c432c0b1c1dcd14a9b7990f8850058dcd2"} Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.731129 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" event={"ID":"e7b5b76d-8f89-4753-81a8-81a886d87abf","Type":"ContainerStarted","Data":"d113871a2f51264c1a86caa35465ad77686aa4b5cbb641f6736b4d4f76e17010"} Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.753012 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj"] Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.785994 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-42z4z" event={"ID":"e118cb4b-2a7d-4f32-ad96-98ad49c928f7","Type":"ContainerStarted","Data":"7d033beb135caf01c84b1448d677a12cccb958bb27a23013491bca465d3c067a"} Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.806127 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.806185 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.833387 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:30 crc kubenswrapper[4585]: E1201 14:00:30.834522 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:31.334501575 +0000 UTC m=+145.318715430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.854982 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-42pj4" podStartSLOduration=124.854949714 podStartE2EDuration="2m4.854949714s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:30.772041293 +0000 UTC m=+144.756255178" watchObservedRunningTime="2025-12-01 14:00:30.854949714 +0000 UTC m=+144.839163579" Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.886094 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pfvzd" Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.899462 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gvp7h" podStartSLOduration=124.899440147 podStartE2EDuration="2m4.899440147s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:30.861740343 +0000 UTC m=+144.845954198" watchObservedRunningTime="2025-12-01 14:00:30.899440147 +0000 UTC m=+144.883654002" Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.905833 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n4pr6"] Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.919585 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c5f9s"] Dec 01 14:00:30 crc kubenswrapper[4585]: I1201 14:00:30.937264 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:30 crc kubenswrapper[4585]: E1201 14:00:30.980494 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:31.480472477 +0000 UTC m=+145.464686332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.032829 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4"] Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.085739 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5"] Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.089562 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:31 crc kubenswrapper[4585]: E1201 14:00:31.091003 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:31.590932955 +0000 UTC m=+145.575146810 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.089186 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:31 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:31 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:31 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.092651 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.097699 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:31 crc kubenswrapper[4585]: E1201 14:00:31.099246 4585 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:31.599228732 +0000 UTC m=+145.583442587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.187681 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl"] Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.189903 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hdrjx"] Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.203477 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:31 crc kubenswrapper[4585]: E1201 14:00:31.203882 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:31.703866693 +0000 UTC m=+145.688080538 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.278707 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d2nj6"] Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.308575 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:31 crc kubenswrapper[4585]: E1201 14:00:31.308914 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:31.808900816 +0000 UTC m=+145.793114681 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.409119 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:31 crc kubenswrapper[4585]: E1201 14:00:31.409507 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:31.909493066 +0000 UTC m=+145.893706921 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.475053 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2l65t"] Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.510671 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:31 crc kubenswrapper[4585]: E1201 14:00:31.511053 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:32.011022147 +0000 UTC m=+145.995236002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.615690 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:31 crc kubenswrapper[4585]: E1201 14:00:31.616005 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:32.115989738 +0000 UTC m=+146.100203603 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.717348 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:31 crc kubenswrapper[4585]: E1201 14:00:31.717962 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:32.217950662 +0000 UTC m=+146.202164517 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.800201 4585 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-c7gls container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.800273 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" podUID="79abd33c-0184-473e-8bb9-c408a5c32efc" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.818849 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:31 crc kubenswrapper[4585]: E1201 14:00:31.819255 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:32.319240935 +0000 UTC m=+146.303454790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.820137 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" event={"ID":"1ca012d6-094a-4703-b8cc-d9d53fa9886d","Type":"ContainerStarted","Data":"a9aad827d52d1f52de173a6dcfb71211259ba5168b0d26b084d15fbd09147e29"} Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.821860 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96"] Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.822362 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" event={"ID":"4c6d8fb5-74db-4420-90f6-33a5026a755f","Type":"ContainerStarted","Data":"52ec8de6f3ee1160535f95ef043358c26289fe5277a929f1071f9e9c79dafbce"} Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.822397 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" event={"ID":"4c6d8fb5-74db-4420-90f6-33a5026a755f","Type":"ContainerStarted","Data":"24e4bbf8c3065e77dc240a45a8347b79b434a9e2048c99b511708fbc31068b69"} Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.861878 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" event={"ID":"fc4fd121-a822-4178-b67f-3f93eedd535e","Type":"ContainerStarted","Data":"5cc0f29d92cb1c440eea9655e88e7d1d72673911c32aaea522f96e08e5c4c9e3"} Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.862343 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.868063 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" event={"ID":"7f9e4580-6380-44b8-aec6-8b4ff897cd82","Type":"ContainerStarted","Data":"26a14334b42a7912723b2b3ccca189f315a7f7e2c7002b76618ba01093150b22"} Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.910713 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6bj" podStartSLOduration=124.910695501 podStartE2EDuration="2m4.910695501s" podCreationTimestamp="2025-12-01 13:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:31.91005273 +0000 UTC m=+145.894266585" watchObservedRunningTime="2025-12-01 14:00:31.910695501 +0000 UTC m=+145.894909356" Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.911496 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d2nj6" event={"ID":"5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7","Type":"ContainerStarted","Data":"3fb12a50769d1748a1b61bf0c96b25745fac6bb0415f3e8c25d87d964d44beb1"} Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 
14:00:31.926909 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:31 crc kubenswrapper[4585]: E1201 14:00:31.929592 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:32.429580779 +0000 UTC m=+146.413794624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.947607 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-42z4z" event={"ID":"e118cb4b-2a7d-4f32-ad96-98ad49c928f7","Type":"ContainerStarted","Data":"94ac112b7cb387698fc41027bfb286ce6bd61b5ee727a0b2270be8bc6b31b36c"} Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.975243 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" podStartSLOduration=124.975210189 podStartE2EDuration="2m4.975210189s" podCreationTimestamp="2025-12-01 13:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:31.959560765 +0000 UTC m=+145.943774620" watchObservedRunningTime="2025-12-01 14:00:31.975210189 +0000 UTC m=+145.959424044" Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.975419 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7hljs"] Dec 01 14:00:31 crc kubenswrapper[4585]: I1201 14:00:31.982819 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" event={"ID":"fe562f92-5985-4fbf-a5b9-8359e7a044d9","Type":"ContainerStarted","Data":"78dc951ea2f34e2b9896416a03760c92996fc5abfc81c95cf656b89012d57dc3"} Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.005297 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pm2hc" podStartSLOduration=126.005278818 podStartE2EDuration="2m6.005278818s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:32.004934847 +0000 UTC m=+145.989148702" watchObservedRunningTime="2025-12-01 14:00:32.005278818 +0000 UTC m=+145.989492673" Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.028486 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:32 crc kubenswrapper[4585]: E1201 14:00:32.028674 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:32.52864654 +0000 UTC m=+146.512860405 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.028925 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:32 crc kubenswrapper[4585]: E1201 14:00:32.030267 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:32.530247722 +0000 UTC m=+146.514461647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.051009 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl" event={"ID":"dc83aa4a-2686-47c8-876b-c6cf2192b493","Type":"ContainerStarted","Data":"db3c719c81cd91d0fb5701bfa4d1928943290bf013454c81dbe7c9426e6f750e"} Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.071323 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-42z4z" podStartSLOduration=10.071308975 podStartE2EDuration="10.071308975s" podCreationTimestamp="2025-12-01 14:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:32.069035271 +0000 UTC m=+146.053249126" watchObservedRunningTime="2025-12-01 14:00:32.071308975 +0000 UTC m=+146.055522830" Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.086174 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" event={"ID":"f0de865f-14b4-4fd8-9be7-3b5655814a67","Type":"ContainerStarted","Data":"be1dfca690923a42aa503298523d069c2fde55b26c59b96dc6e47fcb77320fb0"} Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.110314 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:32 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:32 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:32 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.110358 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.138705 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.138929 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" podStartSLOduration=126.138908022 podStartE2EDuration="2m6.138908022s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:32.13822144 +0000 UTC m=+146.122435295" watchObservedRunningTime="2025-12-01 14:00:32.138908022 +0000 UTC m=+146.123121877" Dec 01 14:00:32 
crc kubenswrapper[4585]: E1201 14:00:32.139047 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:32.639032386 +0000 UTC m=+146.623246241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.182219 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" event={"ID":"fb007ad6-c27c-4d9e-8b4f-942db474d37b","Type":"ContainerStarted","Data":"09d49f3f083eb63dbe026a69f15df388115c3ab68a32a0fe5fb5c39bd3278afc"} Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.182578 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.185689 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" event={"ID":"bbfadd6e-84a7-4fa8-9766-0358345cf2e2","Type":"ContainerStarted","Data":"4d8be49a86b9084c38fb17fa3c1341cdf08aafe74036eeb64d3bc73c625da539"} Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.210489 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" event={"ID":"b615253a-f52e-4607-a63c-7cf1c07dab6b","Type":"ContainerStarted","Data":"33f4eb70234be22f1a89b6a1b7f15ceb7661bb524881e415d3dd8edbec9a3dd0"} Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.211559 4585 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-jcxv5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.211604 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" podUID="fb007ad6-c27c-4d9e-8b4f-942db474d37b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.225542 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" event={"ID":"e0b7b830-078c-4448-b914-ab62e5ff7059","Type":"ContainerStarted","Data":"c271bb81a6ff7c8a174716750428e5b3af447fa7e355a3f9f0c65bd01d6fba6f"} Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.226370 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.233129 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2l65t" 
event={"ID":"0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd","Type":"ContainerStarted","Data":"3d19b004db55fcd63e6140cfa8b0ad0b17a05f30104ab850ec2db77bae75f44f"} Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.248023 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:32 crc kubenswrapper[4585]: E1201 14:00:32.249039 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:32.749029409 +0000 UTC m=+146.733243264 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.254140 4585 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-n4pr6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.254177 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" podUID="e0b7b830-078c-4448-b914-ab62e5ff7059" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.260298 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" podStartSLOduration=125.260284492 podStartE2EDuration="2m5.260284492s" podCreationTimestamp="2025-12-01 13:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:32.234323256 +0000 UTC m=+146.218537111" watchObservedRunningTime="2025-12-01 14:00:32.260284492 +0000 UTC m=+146.244498347" Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.348987 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:32 crc kubenswrapper[4585]: E1201 14:00:32.349341 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-01 14:00:32.84931614 +0000 UTC m=+146.833529995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.349487 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:32 crc kubenswrapper[4585]: E1201 14:00:32.352245 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:32.852236314 +0000 UTC m=+146.836450169 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.450669 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:32 crc kubenswrapper[4585]: E1201 14:00:32.450997 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:32.950965274 +0000 UTC m=+146.935179119 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.552261 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:32 crc kubenswrapper[4585]: E1201 14:00:32.553127 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.053110574 +0000 UTC m=+147.037324429 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.653935 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:32 crc kubenswrapper[4585]: E1201 14:00:32.654317 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.154302704 +0000 UTC m=+147.138516559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.755479 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:32 crc kubenswrapper[4585]: E1201 14:00:32.755810 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.255794513 +0000 UTC m=+147.240008368 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.856724 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:32 crc kubenswrapper[4585]: E1201 14:00:32.857243 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.35722803 +0000 UTC m=+147.341441885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:32 crc kubenswrapper[4585]: I1201 14:00:32.958261 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:32 crc kubenswrapper[4585]: E1201 14:00:32.958637 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.458619116 +0000 UTC m=+147.442833041 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.060579 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:33 crc kubenswrapper[4585]: E1201 14:00:33.060743 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.560710814 +0000 UTC m=+147.544924679 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.060823 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:33 crc kubenswrapper[4585]: E1201 14:00:33.061272 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.561259722 +0000 UTC m=+147.545473577 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.106225 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:33 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:33 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:33 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.106282 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.162596 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:33 crc kubenswrapper[4585]: E1201 14:00:33.162941 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.662923487 +0000 UTC m=+147.647137342 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.263587 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:33 crc kubenswrapper[4585]: E1201 14:00:33.264179 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.764168337 +0000 UTC m=+147.748382192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.293263 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" event={"ID":"bbfadd6e-84a7-4fa8-9766-0358345cf2e2","Type":"ContainerStarted","Data":"e3feb15cd01f16e39b16f268b67fc01d4a922c240356392fdd3fd20edbd4889b"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.309721 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-sm85h" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.319716 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" event={"ID":"b615253a-f52e-4607-a63c-7cf1c07dab6b","Type":"ContainerStarted","Data":"8657a9e1dd258ca6c13c7fc6d2ca6b96548f6fdfd234e7153f50143454c084d4"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.320630 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.338912 4585 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hdrjx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.338961 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" podUID="b615253a-f52e-4607-a63c-7cf1c07dab6b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 01 14:00:33 crc 
kubenswrapper[4585]: I1201 14:00:33.341175 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" event={"ID":"e0b7b830-078c-4448-b914-ab62e5ff7059","Type":"ContainerStarted","Data":"2cbe9f4750d1dc4f5fdacae9f66accbd7e008fac26e4d61d68b288134b1bf124"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.352770 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.359230 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl" event={"ID":"dc83aa4a-2686-47c8-876b-c6cf2192b493","Type":"ContainerStarted","Data":"95975ca939ec5c3989988fecb2587eaac8bab4a3e222f732dc323ba43f0b6a24"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.367323 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:33 crc kubenswrapper[4585]: E1201 14:00:33.368432 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.868415865 +0000 UTC m=+147.852629720 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.393390 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" event={"ID":"c5a1b274-e180-463c-acf4-d2a5aa181827","Type":"ContainerStarted","Data":"fc0d8680a234502a8f8c352a1b845369333dd47573232e46e81df83b869c1bc6"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.393434 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" event={"ID":"c5a1b274-e180-463c-acf4-d2a5aa181827","Type":"ContainerStarted","Data":"b21cd1fdcdcd4f83af75fa396d1bb71446890c1f806300786462828e679183f9"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.393444 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" event={"ID":"c5a1b274-e180-463c-acf4-d2a5aa181827","Type":"ContainerStarted","Data":"3cb52dd2aa4df1cc2d9a33d2e50734fe065f41c2a815216ec14ea3f65a17ec79"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.395089 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7hljs" event={"ID":"878aa194-4036-493a-8e75-990c4b57e793","Type":"ContainerStarted","Data":"efe4ce47fc07eebe50f6484e893e058f5965747df68b5e58a26a79003e24397d"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.395113 4585 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7hljs" event={"ID":"878aa194-4036-493a-8e75-990c4b57e793","Type":"ContainerStarted","Data":"d0e0f39e34bc3454c729658d39e6ddcd66b506fa4a600626426326e1b263ca8a"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.396022 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" event={"ID":"1ca012d6-094a-4703-b8cc-d9d53fa9886d","Type":"ContainerStarted","Data":"589913f8e0eccdf800c0ca0f20d5850b40b34cbd7ee4a27991847f30a8b4690f"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.397117 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2l65t" event={"ID":"0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd","Type":"ContainerStarted","Data":"1bceb66f3df9e4de14373cc636c60c0f27fb8a980e2529e039b5149c2a0c7590"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.398287 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d2nj6" event={"ID":"5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7","Type":"ContainerStarted","Data":"ba131968147119cb19253513aba62ee70c9f067c5d430ded62e4354ec6e819cc"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.398312 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d2nj6" event={"ID":"5d58ac16-d8ec-4d10-ab7f-cc20ecb76eb7","Type":"ContainerStarted","Data":"318e23742f654fb283eb03ccfc13c3d19e83c35b18783983bed5d21ea729a51d"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.398523 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.416466 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" podStartSLOduration=127.416446702 podStartE2EDuration="2m7.416446702s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:32.262954148 +0000 UTC m=+146.247168003" watchObservedRunningTime="2025-12-01 14:00:33.416446702 +0000 UTC m=+147.400660567" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.420666 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" event={"ID":"f0de865f-14b4-4fd8-9be7-3b5655814a67","Type":"ContainerStarted","Data":"f0d90a12467d5796ed9884d35cdf3aabc20e06610712f647bdbca7e76349e3b7"} Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.471777 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.471858 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:33 crc 
kubenswrapper[4585]: I1201 14:00:33.471886 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.471917 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.471991 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:33 crc kubenswrapper[4585]: E1201 14:00:33.474680 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:33.974668448 +0000 UTC m=+147.958882303 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.484096 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.498385 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jcxv5" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.499692 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.501965 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.502609 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" podStartSLOduration=126.502595477 podStartE2EDuration="2m6.502595477s" podCreationTimestamp="2025-12-01 13:58:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:33.502104151 +0000 UTC m=+147.486318006" watchObservedRunningTime="2025-12-01 14:00:33.502595477 +0000 UTC m=+147.486809322" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.510743 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.574234 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:33 crc kubenswrapper[4585]: E1201 14:00:33.574328 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:34.074314117 +0000 UTC m=+148.058527972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.574500 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:33 crc kubenswrapper[4585]: E1201 14:00:33.574772 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:34.074765292 +0000 UTC m=+148.058979147 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.679728 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:33 crc kubenswrapper[4585]: E1201 14:00:33.680200 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:34.180177038 +0000 UTC m=+148.164390893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.688142 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-d2nj6" podStartSLOduration=11.688126094 podStartE2EDuration="11.688126094s" podCreationTimestamp="2025-12-01 14:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:33.681128218 +0000 UTC m=+147.665342073" watchObservedRunningTime="2025-12-01 14:00:33.688126094 +0000 UTC m=+147.672339949" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.732554 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.740327 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rn9hl" podStartSLOduration=127.740310455 podStartE2EDuration="2m7.740310455s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:33.740272804 +0000 UTC m=+147.724486659" watchObservedRunningTime="2025-12-01 14:00:33.740310455 +0000 UTC m=+147.724524310" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.751996 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.758392 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.796926 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:33 crc kubenswrapper[4585]: E1201 14:00:33.797226 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:34.297212228 +0000 UTC m=+148.281426083 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.825314 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.825352 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.840115 4585 patch_prober.go:28] interesting pod/console-f9d7485db-f9k95 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.840170 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-f9k95" podUID="bb6e47d0-5966-48d3-be81-97265e7e7a4f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.849404 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" podStartSLOduration=33.849389748 podStartE2EDuration="33.849389748s" podCreationTimestamp="2025-12-01 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:33.771997665 +0000 UTC m=+147.756211520" watchObservedRunningTime="2025-12-01 14:00:33.849389748 +0000 UTC m=+147.833603593" Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.901072 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:33 crc kubenswrapper[4585]: E1201 14:00:33.902246 4585 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:34.40222987 +0000 UTC m=+148.386443725 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:33 crc kubenswrapper[4585]: I1201 14:00:33.919459 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kwp96" podStartSLOduration=127.919441365 podStartE2EDuration="2m7.919441365s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:33.849905945 +0000 UTC m=+147.834119800" watchObservedRunningTime="2025-12-01 14:00:33.919441365 +0000 UTC m=+147.903655230" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.003673 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.004026 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:34.504015039 +0000 UTC m=+148.488228894 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.030544 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jsmc4" podStartSLOduration=128.030529023 podStartE2EDuration="2m8.030529023s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:34.029379736 +0000 UTC m=+148.013593611" watchObservedRunningTime="2025-12-01 14:00:34.030529023 +0000 UTC m=+148.014742878" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.116208 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:34 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:34 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:34 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.116262 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.117608 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.117886 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:34.617870307 +0000 UTC m=+148.602084162 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.219476 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.219895 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:34.719880552 +0000 UTC m=+148.704094407 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.248478 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.248551 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.257151 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.320895 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.321090 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:34.821073812 +0000 UTC m=+148.805287667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.321359 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.321686 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:34.821667781 +0000 UTC m=+148.805881636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.422026 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.422378 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:34.922364145 +0000 UTC m=+148.906578000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.432645 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ng9zf"] Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.442738 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7hljs" event={"ID":"878aa194-4036-493a-8e75-990c4b57e793","Type":"ContainerStarted","Data":"e8f0e12f57feb7cc0d410bf88d69e17d5df99eee6c8f23541d0291f5300032c5"} Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.442859 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.448829 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2l65t" event={"ID":"0b61c7e4-9ec8-4fd7-a9d8-2e386dbf9acd","Type":"ContainerStarted","Data":"96b018208735aff070d884253def3d78dce378f3bc9b9e549b20ff365b11aea0"} Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.452496 4585 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hdrjx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.452545 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" podUID="b615253a-f52e-4607-a63c-7cf1c07dab6b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.452575 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.459422 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ng9zf"] Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.461033 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2zzvd" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.524290 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611e970e-43b2-43b8-b2d6-6302693b7c88-utilities\") pod \"community-operators-ng9zf\" (UID: \"611e970e-43b2-43b8-b2d6-6302693b7c88\") " pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.524633 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5fhb\" (UniqueName: \"kubernetes.io/projected/611e970e-43b2-43b8-b2d6-6302693b7c88-kube-api-access-v5fhb\") pod \"community-operators-ng9zf\" (UID: 
\"611e970e-43b2-43b8-b2d6-6302693b7c88\") " pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.524775 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.524815 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611e970e-43b2-43b8-b2d6-6302693b7c88-catalog-content\") pod \"community-operators-ng9zf\" (UID: \"611e970e-43b2-43b8-b2d6-6302693b7c88\") " pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.527090 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:35.027078718 +0000 UTC m=+149.011292573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.543354 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7hljs" podStartSLOduration=128.543339121 podStartE2EDuration="2m8.543339121s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:34.472187779 +0000 UTC m=+148.456401634" watchObservedRunningTime="2025-12-01 14:00:34.543339121 +0000 UTC m=+148.527552976" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.602277 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-2l65t" podStartSLOduration=128.602259779 podStartE2EDuration="2m8.602259779s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:34.546839084 +0000 UTC m=+148.531052939" watchObservedRunningTime="2025-12-01 14:00:34.602259779 +0000 UTC m=+148.586473634" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.626472 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.626572 4585 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:35.126557452 +0000 UTC m=+149.110771307 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.626771 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5fhb\" (UniqueName: \"kubernetes.io/projected/611e970e-43b2-43b8-b2d6-6302693b7c88-kube-api-access-v5fhb\") pod \"community-operators-ng9zf\" (UID: \"611e970e-43b2-43b8-b2d6-6302693b7c88\") " pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.626826 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.626847 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611e970e-43b2-43b8-b2d6-6302693b7c88-catalog-content\") pod \"community-operators-ng9zf\" (UID: \"611e970e-43b2-43b8-b2d6-6302693b7c88\") " pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.626875 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611e970e-43b2-43b8-b2d6-6302693b7c88-utilities\") pod \"community-operators-ng9zf\" (UID: \"611e970e-43b2-43b8-b2d6-6302693b7c88\") " pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.627314 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611e970e-43b2-43b8-b2d6-6302693b7c88-utilities\") pod \"community-operators-ng9zf\" (UID: \"611e970e-43b2-43b8-b2d6-6302693b7c88\") " pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.627753 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:35.12774569 +0000 UTC m=+149.111959535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.628108 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611e970e-43b2-43b8-b2d6-6302693b7c88-catalog-content\") pod \"community-operators-ng9zf\" (UID: \"611e970e-43b2-43b8-b2d6-6302693b7c88\") " pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.630733 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8pvrj"] Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.631542 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:00:34 crc kubenswrapper[4585]: W1201 14:00:34.643671 4585 reflector.go:561] object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g": failed to list *v1.Secret: secrets "certified-operators-dockercfg-4rs5g" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.643714 4585 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-4rs5g\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"certified-operators-dockercfg-4rs5g\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.687935 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5fhb\" (UniqueName: \"kubernetes.io/projected/611e970e-43b2-43b8-b2d6-6302693b7c88-kube-api-access-v5fhb\") pod \"community-operators-ng9zf\" (UID: \"611e970e-43b2-43b8-b2d6-6302693b7c88\") " pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.740228 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.740404 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd194ad4-dd93-47aa-8c18-afc2426825ac-utilities\") pod \"certified-operators-8pvrj\" (UID: \"dd194ad4-dd93-47aa-8c18-afc2426825ac\") " pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.740446 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/dd194ad4-dd93-47aa-8c18-afc2426825ac-catalog-content\") pod \"certified-operators-8pvrj\" (UID: \"dd194ad4-dd93-47aa-8c18-afc2426825ac\") " pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.740519 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7cr6\" (UniqueName: \"kubernetes.io/projected/dd194ad4-dd93-47aa-8c18-afc2426825ac-kube-api-access-t7cr6\") pod \"certified-operators-8pvrj\" (UID: \"dd194ad4-dd93-47aa-8c18-afc2426825ac\") " pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.740614 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:35.240596095 +0000 UTC m=+149.224809950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.764368 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.799739 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8pvrj"] Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.821598 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.822355 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.822407 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.834278 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.834530 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.847275 4585 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/community-operators-nh772"] Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.848405 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nh772" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.850710 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd194ad4-dd93-47aa-8c18-afc2426825ac-catalog-content\") pod \"certified-operators-8pvrj\" (UID: \"dd194ad4-dd93-47aa-8c18-afc2426825ac\") " pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.850808 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.850834 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7cr6\" (UniqueName: \"kubernetes.io/projected/dd194ad4-dd93-47aa-8c18-afc2426825ac-kube-api-access-t7cr6\") pod \"certified-operators-8pvrj\" (UID: \"dd194ad4-dd93-47aa-8c18-afc2426825ac\") " pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.850858 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd194ad4-dd93-47aa-8c18-afc2426825ac-utilities\") pod \"certified-operators-8pvrj\" (UID: \"dd194ad4-dd93-47aa-8c18-afc2426825ac\") " pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.851281 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd194ad4-dd93-47aa-8c18-afc2426825ac-utilities\") pod \"certified-operators-8pvrj\" (UID: \"dd194ad4-dd93-47aa-8c18-afc2426825ac\") " pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.851498 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd194ad4-dd93-47aa-8c18-afc2426825ac-catalog-content\") pod \"certified-operators-8pvrj\" (UID: \"dd194ad4-dd93-47aa-8c18-afc2426825ac\") " pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.851766 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:35.351754846 +0000 UTC m=+149.335968701 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.966793 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:34 crc kubenswrapper[4585]: I1201 14:00:34.967678 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7cr6\" (UniqueName: \"kubernetes.io/projected/dd194ad4-dd93-47aa-8c18-afc2426825ac-kube-api-access-t7cr6\") pod \"certified-operators-8pvrj\" (UID: \"dd194ad4-dd93-47aa-8c18-afc2426825ac\") " pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:00:34 crc kubenswrapper[4585]: E1201 14:00:34.969988 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:35.469956483 +0000 UTC m=+149.454170338 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.023018 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nh772"] Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.034199 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qzv99"] Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.036167 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.070316 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qzv99"] Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.073246 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.073311 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfzqw\" (UniqueName: \"kubernetes.io/projected/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-kube-api-access-dfzqw\") pod \"community-operators-nh772\" (UID: \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\") " pod="openshift-marketplace/community-operators-nh772" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.073364 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-utilities\") pod \"community-operators-nh772\" (UID: \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\") " pod="openshift-marketplace/community-operators-nh772" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.073437 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-catalog-content\") pod \"community-operators-nh772\" (UID: \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\") " pod="openshift-marketplace/community-operators-nh772" Dec 01 14:00:35 crc kubenswrapper[4585]: E1201 14:00:35.077204 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:35.577190138 +0000 UTC m=+149.561403993 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.104727 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:35 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:35 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:35 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.104787 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.174410 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.174625 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wsmr\" (UniqueName: \"kubernetes.io/projected/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-kube-api-access-9wsmr\") pod \"certified-operators-qzv99\" (UID: \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\") " pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.174695 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfzqw\" (UniqueName: \"kubernetes.io/projected/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-kube-api-access-dfzqw\") pod \"community-operators-nh772\" (UID: \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\") " pod="openshift-marketplace/community-operators-nh772" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.174720 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-utilities\") pod \"community-operators-nh772\" (UID: \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\") " pod="openshift-marketplace/community-operators-nh772" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.174768 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-utilities\") pod \"certified-operators-qzv99\" (UID: \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\") " pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.174791 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-catalog-content\") pod \"community-operators-nh772\" (UID: \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\") " pod="openshift-marketplace/community-operators-nh772" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.174810 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-catalog-content\") pod \"certified-operators-qzv99\" (UID: \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\") " pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:00:35 crc kubenswrapper[4585]: E1201 14:00:35.174932 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:35.674916716 +0000 UTC m=+149.659130571 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.175585 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-utilities\") pod \"community-operators-nh772\" (UID: \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\") " pod="openshift-marketplace/community-operators-nh772" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.175844 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-catalog-content\") pod \"community-operators-nh772\" (UID: \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\") " pod="openshift-marketplace/community-operators-nh772" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.221685 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfzqw\" (UniqueName: \"kubernetes.io/projected/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-kube-api-access-dfzqw\") pod \"community-operators-nh772\" (UID: \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\") " pod="openshift-marketplace/community-operators-nh772" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.275482 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-utilities\") pod \"certified-operators-qzv99\" (UID: \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\") " pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.275525 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-catalog-content\") pod \"certified-operators-qzv99\" (UID: \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\") " pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.275566 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9wsmr\" (UniqueName: \"kubernetes.io/projected/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-kube-api-access-9wsmr\") pod \"certified-operators-qzv99\" (UID: \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\") " pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.275595 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.276798 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-catalog-content\") pod \"certified-operators-qzv99\" (UID: \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\") " pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.277026 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-utilities\") pod \"certified-operators-qzv99\" (UID: \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\") " pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:00:35 crc kubenswrapper[4585]: E1201 14:00:35.275965 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:35.77595372 +0000 UTC m=+149.760167575 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.277307 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nh772" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.314463 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wsmr\" (UniqueName: \"kubernetes.io/projected/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-kube-api-access-9wsmr\") pod \"certified-operators-qzv99\" (UID: \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\") " pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.377191 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:35 crc kubenswrapper[4585]: E1201 14:00:35.377821 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:35.877805751 +0000 UTC m=+149.862019606 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.479028 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:35 crc kubenswrapper[4585]: E1201 14:00:35.479382 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:35.979371182 +0000 UTC m=+149.963585037 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.484188 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.552926 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" event={"ID":"bbfadd6e-84a7-4fa8-9766-0358345cf2e2","Type":"ContainerStarted","Data":"b6e48b15915db4d06d2e97a0be3032237135e186dcd72a39db95e555f52f5424"} Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.553948 4585 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hdrjx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.554008 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" podUID="b615253a-f52e-4607-a63c-7cf1c07dab6b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.584046 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:35 crc kubenswrapper[4585]: E1201 14:00:35.584711 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:36.084697015 +0000 UTC m=+150.068910860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.586747 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.591124 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.592894 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.686880 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:35 crc kubenswrapper[4585]: E1201 14:00:35.688339 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:36.188325713 +0000 UTC m=+150.172539568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.709615 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.709648 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.749764 4585 patch_prober.go:28] interesting pod/apiserver-76f77b778f-m8p5b container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 01 14:00:35 crc kubenswrapper[4585]: [+]log ok Dec 01 14:00:35 crc kubenswrapper[4585]: [+]etcd ok Dec 01 14:00:35 crc kubenswrapper[4585]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 01 14:00:35 crc kubenswrapper[4585]: [+]poststarthook/generic-apiserver-start-informers ok Dec 01 14:00:35 crc kubenswrapper[4585]: [+]poststarthook/max-in-flight-filter ok Dec 01 14:00:35 crc kubenswrapper[4585]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 01 14:00:35 crc kubenswrapper[4585]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 01 14:00:35 crc kubenswrapper[4585]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 01 14:00:35 crc kubenswrapper[4585]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 01 14:00:35 crc kubenswrapper[4585]: [+]poststarthook/project.openshift.io-projectcache ok Dec 01 14:00:35 crc kubenswrapper[4585]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 01 14:00:35 crc kubenswrapper[4585]: [+]poststarthook/openshift.io-startinformers ok Dec 01 14:00:35 crc kubenswrapper[4585]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 01 14:00:35 crc kubenswrapper[4585]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 01 14:00:35 crc kubenswrapper[4585]: livez check failed Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.750000 4585 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" podUID="fe562f92-5985-4fbf-a5b9-8359e7a044d9" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.788454 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:35 crc kubenswrapper[4585]: E1201 14:00:35.788784 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:36.288770118 +0000 UTC m=+150.272983973 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.889946 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:35 crc kubenswrapper[4585]: E1201 14:00:35.891331 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:36.391318562 +0000 UTC m=+150.375532417 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.991571 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:35 crc kubenswrapper[4585]: E1201 14:00:35.991755 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:36.491728716 +0000 UTC m=+150.475942571 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:35 crc kubenswrapper[4585]: I1201 14:00:35.992182 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:35 crc kubenswrapper[4585]: E1201 14:00:35.992524 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:36.492514891 +0000 UTC m=+150.476728746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:36 crc kubenswrapper[4585]: W1201 14:00:36.001842 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-709f474d66b28fd65fe820df148ddabf22c6628ec9933b7f27c87cc3dc3c0445 WatchSource:0}: Error finding container 709f474d66b28fd65fe820df148ddabf22c6628ec9933b7f27c87cc3dc3c0445: Status 404 returned error can't find the container with id 709f474d66b28fd65fe820df148ddabf22c6628ec9933b7f27c87cc3dc3c0445 Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.084590 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.090399 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:36 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:36 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:36 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.090473 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.093743 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:36 crc kubenswrapper[4585]: E1201 14:00:36.094344 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:36.594322361 +0000 UTC m=+150.578536216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.195788 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:36 crc kubenswrapper[4585]: E1201 14:00:36.197608 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:36.697596448 +0000 UTC m=+150.681810303 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.297269 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:36 crc kubenswrapper[4585]: E1201 14:00:36.297655 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:36.79763565 +0000 UTC m=+150.781849505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.383800 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.430682 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:36 crc kubenswrapper[4585]: E1201 14:00:36.431297 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:36.931284155 +0000 UTC m=+150.915498010 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.532081 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:36 crc kubenswrapper[4585]: E1201 14:00:36.532435 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:37.032420962 +0000 UTC m=+151.016634817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.579503 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4894f8ede234d9cbbea34be17890215f58624154beea6908669ef1ccd1645a01"} Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.579682 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"709f474d66b28fd65fe820df148ddabf22c6628ec9933b7f27c87cc3dc3c0445"} Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.602215 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d09071fafae658a9ebf584d8d6ad402a3dd9931d061fd6dc51abc5a9ffa41e6d"} Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.602777 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"742c533edff5ff04da05315c5c6aa4b66a5070f41a1bfa8eca11f233e5374d9e"} Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.630814 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" event={"ID":"bbfadd6e-84a7-4fa8-9766-0358345cf2e2","Type":"ContainerStarted","Data":"ad274ff2df8d9b80df1be09a2a97f02be0d3f99bd3de0bbd856b8268f5167a92"} Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.635804 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:36 crc kubenswrapper[4585]: E1201 14:00:36.636170 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 14:00:37.136155874 +0000 UTC m=+151.120369729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nrbjm" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.647952 4585 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.672078 4585 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-01T14:00:36.647984325Z","Handler":null,"Name":""} Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.736639 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:36 crc kubenswrapper[4585]: E1201 14:00:36.738016 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 14:00:37.237999434 +0000 UTC m=+151.222213289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.779514 4585 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.779546 4585 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.838221 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.890814 4585 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
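The entries above show the kubelet's fixed 500ms retry loop (nestedpendingoperations, "durationBeforeRetry 500ms") for the image-registry PVC failing with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" until the plugin watcher picks up the plugins_registry socket, the CSI driver is validated and registered, and MountDevice is then skipped because the driver does not advertise STAGE_UNSTAGE_VOLUME. Below is a minimal, illustrative sketch (Python 3 standard library only; the kubelet.log file name, the hard-coded 2025 year, and the regular expressions are assumptions made for this example, not part of any OpenShift or kubelet tooling) that parses journal lines like these to measure how long a volume kept failing before its driver registered.

#!/usr/bin/env python3
"""Illustrative sketch only: scan kubelet journal lines for CSI "driver not
registered" retry errors and report how long they lasted before the matching
"Register new plugin" entry appeared. File name, year, and regexes are
assumptions for this example."""

import re
import sys
from datetime import datetime

# klog prefix inside each journal entry, e.g. "I1201 14:00:36.890814" (month, day, wall clock; no year).
KLOG_TS = re.compile(r"[IWE](\d{2})(\d{2}) (\d{2}:\d{2}:\d{2}\.\d+)")
DRIVER_MISSING = re.compile(r"driver name (\S+) not found in the list of registered CSI drivers")
REGISTERED = re.compile(r"Register new plugin with name: (\S+)")


def klog_time(line):
    """Return the klog timestamp of a journal line, or None if it has no klog prefix."""
    m = KLOG_TS.search(line)
    if not m:
        return None
    month, day, clock = m.groups()
    # The klog prefix carries no year; 2025 is taken from the surrounding journald timestamps.
    return datetime.strptime(f"2025-{month}-{day} {clock}", "%Y-%m-%d %H:%M:%S.%f")


def main(path="kubelet.log"):
    first_failure = {}   # driver name -> time of first "not registered" error
    registered_at = {}   # driver name -> time of registration
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            ts = klog_time(line)
            if ts is None:
                continue
            m = DRIVER_MISSING.search(line)
            if m and m.group(1) not in first_failure:
                first_failure[m.group(1)] = ts
            m = REGISTERED.search(line)
            if m and m.group(1) not in registered_at:
                registered_at[m.group(1)] = ts
    for driver, start in first_failure.items():
        end = registered_at.get(driver)
        if end:
            print(f"{driver}: mounts retried for {(end - start).total_seconds():.1f}s before registration")
        else:
            print(f"{driver}: no registration seen in this log window")


if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "kubelet.log")

Run against a journal export (for example, output saved from journalctl -u kubelet); for the window shown here it would report a gap of roughly a second and a half for kubevirt.io.hostpath-provisioner, between the first 14:00:35 TearDown/MountDevice failure and the 14:00:36 plugin registration.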
Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.890855 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.930198 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rl7qq"] Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.931188 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.970137 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 14:00:36 crc kubenswrapper[4585]: I1201 14:00:36.985951 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ng9zf"] Dec 01 14:00:37 crc kubenswrapper[4585]: W1201 14:00:37.005435 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod611e970e_43b2_43b8_b2d6_6302693b7c88.slice/crio-aef7438efd8ace8fa277fbdb03645f2c9a7d40aa19fd30b6a4c34977955d51e7 WatchSource:0}: Error finding container aef7438efd8ace8fa277fbdb03645f2c9a7d40aa19fd30b6a4c34977955d51e7: Status 404 returned error can't find the container with id aef7438efd8ace8fa277fbdb03645f2c9a7d40aa19fd30b6a4c34977955d51e7 Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.041439 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d64d5a-7b7e-49c9-985a-14efebb14506-utilities\") pod \"redhat-marketplace-rl7qq\" (UID: \"12d64d5a-7b7e-49c9-985a-14efebb14506\") " pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.041608 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d64d5a-7b7e-49c9-985a-14efebb14506-catalog-content\") pod \"redhat-marketplace-rl7qq\" (UID: \"12d64d5a-7b7e-49c9-985a-14efebb14506\") " pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.041717 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25jx9\" (UniqueName: \"kubernetes.io/projected/12d64d5a-7b7e-49c9-985a-14efebb14506-kube-api-access-25jx9\") pod \"redhat-marketplace-rl7qq\" (UID: \"12d64d5a-7b7e-49c9-985a-14efebb14506\") " pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.049911 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rl7qq"] Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.097650 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:37 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:37 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:37 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.097834 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.122132 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qzv99"] Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.131337 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.132012 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.141568 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.141596 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.154503 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d64d5a-7b7e-49c9-985a-14efebb14506-catalog-content\") pod \"redhat-marketplace-rl7qq\" (UID: \"12d64d5a-7b7e-49c9-985a-14efebb14506\") " pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.154567 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25jx9\" (UniqueName: \"kubernetes.io/projected/12d64d5a-7b7e-49c9-985a-14efebb14506-kube-api-access-25jx9\") pod \"redhat-marketplace-rl7qq\" (UID: \"12d64d5a-7b7e-49c9-985a-14efebb14506\") " pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.154593 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d64d5a-7b7e-49c9-985a-14efebb14506-utilities\") pod \"redhat-marketplace-rl7qq\" (UID: \"12d64d5a-7b7e-49c9-985a-14efebb14506\") " pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.155044 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d64d5a-7b7e-49c9-985a-14efebb14506-utilities\") pod \"redhat-marketplace-rl7qq\" (UID: \"12d64d5a-7b7e-49c9-985a-14efebb14506\") " pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.155258 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d64d5a-7b7e-49c9-985a-14efebb14506-catalog-content\") pod \"redhat-marketplace-rl7qq\" (UID: \"12d64d5a-7b7e-49c9-985a-14efebb14506\") " pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.186898 4585 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.256330 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5c9948e-47d0-4af1-b570-de1439b3184a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b5c9948e-47d0-4af1-b570-de1439b3184a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.256382 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5c9948e-47d0-4af1-b570-de1439b3184a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b5c9948e-47d0-4af1-b570-de1439b3184a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.286937 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s4g6p"] Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.295163 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.313397 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25jx9\" (UniqueName: \"kubernetes.io/projected/12d64d5a-7b7e-49c9-985a-14efebb14506-kube-api-access-25jx9\") pod \"redhat-marketplace-rl7qq\" (UID: \"12d64d5a-7b7e-49c9-985a-14efebb14506\") " pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.315886 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4g6p"] Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.365257 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nh772"] Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.366749 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5c9948e-47d0-4af1-b570-de1439b3184a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b5c9948e-47d0-4af1-b570-de1439b3184a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.366861 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5c9948e-47d0-4af1-b570-de1439b3184a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b5c9948e-47d0-4af1-b570-de1439b3184a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.367175 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5c9948e-47d0-4af1-b570-de1439b3184a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b5c9948e-47d0-4af1-b570-de1439b3184a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.401334 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5c9948e-47d0-4af1-b570-de1439b3184a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"b5c9948e-47d0-4af1-b570-de1439b3184a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.468363 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-catalog-content\") pod \"redhat-marketplace-s4g6p\" (UID: \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\") " pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.468399 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-utilities\") pod \"redhat-marketplace-s4g6p\" (UID: \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\") " pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.468654 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trbch\" (UniqueName: \"kubernetes.io/projected/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-kube-api-access-trbch\") pod \"redhat-marketplace-s4g6p\" (UID: \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\") " pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.547273 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.571845 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trbch\" (UniqueName: \"kubernetes.io/projected/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-kube-api-access-trbch\") pod \"redhat-marketplace-s4g6p\" (UID: \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\") " pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.571922 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-catalog-content\") pod \"redhat-marketplace-s4g6p\" (UID: \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\") " pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.571937 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-utilities\") pod \"redhat-marketplace-s4g6p\" (UID: \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\") " pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.573016 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-catalog-content\") pod \"redhat-marketplace-s4g6p\" (UID: \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\") " pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.573079 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.576616 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-utilities\") pod \"redhat-marketplace-s4g6p\" (UID: \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\") " pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.643750 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sbnd6"] Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.644790 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.670176 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.674304 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sbnd6"] Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.692860 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trbch\" (UniqueName: \"kubernetes.io/projected/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-kube-api-access-trbch\") pod \"redhat-marketplace-s4g6p\" (UID: \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\") " pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.703173 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng9zf" event={"ID":"611e970e-43b2-43b8-b2d6-6302693b7c88","Type":"ContainerStarted","Data":"9665767658d3db1016c0e68db54b47778b270fb65d0d0a1536a3fe78adff7af2"} Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.703225 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng9zf" event={"ID":"611e970e-43b2-43b8-b2d6-6302693b7c88","Type":"ContainerStarted","Data":"aef7438efd8ace8fa277fbdb03645f2c9a7d40aa19fd30b6a4c34977955d51e7"} Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.741087 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"08f575bd8d688b29df67ab19da288d17c71996561e9e3dad5ce030e731d5898f"} Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.741226 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c046f89a82a0c0b46e5bfc4ae0b5f83e67af860c5ba9c5e15e5bb4296cc5fcb6"} Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.741768 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.756720 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8pvrj"] Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.774198 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" 
event={"ID":"bbfadd6e-84a7-4fa8-9766-0358345cf2e2","Type":"ContainerStarted","Data":"9a8b85f02f57a5436d69e34a2d64b2986e5aecb1b166540b989967f4e3d065c4"} Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.775715 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh772" event={"ID":"96c46a5f-2a97-42e9-bf04-4a4b83caf45f","Type":"ContainerStarted","Data":"972f5610170271d990f83d12c032bddc53a87fa929c57566f90950dcf7922b95"} Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.780450 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzv99" event={"ID":"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3","Type":"ContainerStarted","Data":"4d3994497a947dddb6ecb9255261b2b0e714bcc07ca9dc90f5fc160119883898"} Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.785736 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzp4k\" (UniqueName: \"kubernetes.io/projected/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-kube-api-access-pzp4k\") pod \"redhat-operators-sbnd6\" (UID: \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\") " pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.785785 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-catalog-content\") pod \"redhat-operators-sbnd6\" (UID: \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\") " pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.785813 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-utilities\") pod \"redhat-operators-sbnd6\" (UID: \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\") " pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.857476 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nrbjm\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.865531 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gjnxb"] Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.866839 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.888125 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjnxb"] Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.888585 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.888759 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzp4k\" (UniqueName: \"kubernetes.io/projected/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-kube-api-access-pzp4k\") pod \"redhat-operators-sbnd6\" (UID: \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\") " pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.888790 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-catalog-content\") pod \"redhat-operators-sbnd6\" (UID: \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\") " pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.888830 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-utilities\") pod \"redhat-operators-sbnd6\" (UID: \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\") " pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.889689 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-catalog-content\") pod \"redhat-operators-sbnd6\" (UID: \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\") " pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.903240 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-utilities\") pod \"redhat-operators-sbnd6\" (UID: \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\") " pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.924334 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.926921 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-c5f9s" podStartSLOduration=15.92690226 podStartE2EDuration="15.92690226s" podCreationTimestamp="2025-12-01 14:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:37.924591116 +0000 UTC m=+151.908804971" watchObservedRunningTime="2025-12-01 14:00:37.92690226 +0000 UTC m=+151.911116115" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.938946 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.947735 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.957100 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzp4k\" (UniqueName: \"kubernetes.io/projected/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-kube-api-access-pzp4k\") pod \"redhat-operators-sbnd6\" (UID: \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\") " pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.997106 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.998705 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1162041e-acb6-4788-aaf3-841a80c7ec48-catalog-content\") pod \"redhat-operators-gjnxb\" (UID: \"1162041e-acb6-4788-aaf3-841a80c7ec48\") " pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.998810 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1162041e-acb6-4788-aaf3-841a80c7ec48-utilities\") pod \"redhat-operators-gjnxb\" (UID: \"1162041e-acb6-4788-aaf3-841a80c7ec48\") " pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:00:37 crc kubenswrapper[4585]: I1201 14:00:37.998999 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhk9g\" (UniqueName: \"kubernetes.io/projected/1162041e-acb6-4788-aaf3-841a80c7ec48-kube-api-access-rhk9g\") pod \"redhat-operators-gjnxb\" (UID: \"1162041e-acb6-4788-aaf3-841a80c7ec48\") " pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.099859 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhk9g\" (UniqueName: \"kubernetes.io/projected/1162041e-acb6-4788-aaf3-841a80c7ec48-kube-api-access-rhk9g\") pod \"redhat-operators-gjnxb\" (UID: \"1162041e-acb6-4788-aaf3-841a80c7ec48\") " pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.099918 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1162041e-acb6-4788-aaf3-841a80c7ec48-catalog-content\") pod \"redhat-operators-gjnxb\" (UID: 
\"1162041e-acb6-4788-aaf3-841a80c7ec48\") " pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.100020 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1162041e-acb6-4788-aaf3-841a80c7ec48-utilities\") pod \"redhat-operators-gjnxb\" (UID: \"1162041e-acb6-4788-aaf3-841a80c7ec48\") " pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.100444 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1162041e-acb6-4788-aaf3-841a80c7ec48-utilities\") pod \"redhat-operators-gjnxb\" (UID: \"1162041e-acb6-4788-aaf3-841a80c7ec48\") " pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.101246 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1162041e-acb6-4788-aaf3-841a80c7ec48-catalog-content\") pod \"redhat-operators-gjnxb\" (UID: \"1162041e-acb6-4788-aaf3-841a80c7ec48\") " pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.105239 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:38 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:38 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:38 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.105502 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.232334 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhk9g\" (UniqueName: \"kubernetes.io/projected/1162041e-acb6-4788-aaf3-841a80c7ec48-kube-api-access-rhk9g\") pod \"redhat-operators-gjnxb\" (UID: \"1162041e-acb6-4788-aaf3-841a80c7ec48\") " pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.318423 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.456716 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.792319 4585 generic.go:334] "Generic (PLEG): container finished" podID="dd194ad4-dd93-47aa-8c18-afc2426825ac" containerID="ae2f108021ebf97e8cd2c5dc8ab1b5cf2d664d2922308c721d789096af5d88d8" exitCode=0 Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.792447 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pvrj" event={"ID":"dd194ad4-dd93-47aa-8c18-afc2426825ac","Type":"ContainerDied","Data":"ae2f108021ebf97e8cd2c5dc8ab1b5cf2d664d2922308c721d789096af5d88d8"} Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.792475 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pvrj" event={"ID":"dd194ad4-dd93-47aa-8c18-afc2426825ac","Type":"ContainerStarted","Data":"92226c221b99c5fa7f5e92fa0dc6f25ab350dee8f1cacca6867f935ff2768120"} Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.795102 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.795662 4585 generic.go:334] "Generic (PLEG): container finished" podID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" containerID="3e1aeb8ba987c1d96923807ee1bf14e219bf47e0baf5f36664c22136bd8c8893" exitCode=0 Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.796307 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzv99" event={"ID":"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3","Type":"ContainerDied","Data":"3e1aeb8ba987c1d96923807ee1bf14e219bf47e0baf5f36664c22136bd8c8893"} Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.817714 4585 generic.go:334] "Generic (PLEG): container finished" podID="611e970e-43b2-43b8-b2d6-6302693b7c88" containerID="9665767658d3db1016c0e68db54b47778b270fb65d0d0a1536a3fe78adff7af2" exitCode=0 Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.818099 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng9zf" event={"ID":"611e970e-43b2-43b8-b2d6-6302693b7c88","Type":"ContainerDied","Data":"9665767658d3db1016c0e68db54b47778b270fb65d0d0a1536a3fe78adff7af2"} Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.834333 4585 generic.go:334] "Generic (PLEG): container finished" podID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" containerID="8abd7558631eb33790615dcc525ea5f882b0360ec23eefd7bbe98d11669dfb62" exitCode=0 Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.834738 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh772" event={"ID":"96c46a5f-2a97-42e9-bf04-4a4b83caf45f","Type":"ContainerDied","Data":"8abd7558631eb33790615dcc525ea5f882b0360ec23eefd7bbe98d11669dfb62"} Dec 01 14:00:38 crc kubenswrapper[4585]: I1201 14:00:38.994601 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 14:00:39 crc kubenswrapper[4585]: I1201 14:00:39.097190 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:39 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:39 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:39 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:39 crc kubenswrapper[4585]: I1201 14:00:39.097244 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:39 crc kubenswrapper[4585]: I1201 14:00:39.387281 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rl7qq"] Dec 01 14:00:39 crc kubenswrapper[4585]: I1201 14:00:39.468476 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sbnd6"] Dec 01 14:00:39 crc kubenswrapper[4585]: I1201 14:00:39.747618 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nrbjm"] Dec 01 14:00:39 crc kubenswrapper[4585]: W1201 14:00:39.771045 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6763aabd_f571_4b13_82fd_3a4a9bdf8406.slice/crio-3e1e827a43f74634644da947859b40ccd0b30f29fb64bd562cb1a345965e8529 WatchSource:0}: Error finding container 3e1e827a43f74634644da947859b40ccd0b30f29fb64bd562cb1a345965e8529: Status 404 returned error can't find the container with id 3e1e827a43f74634644da947859b40ccd0b30f29fb64bd562cb1a345965e8529 Dec 01 14:00:39 crc kubenswrapper[4585]: I1201 14:00:39.851993 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjnxb"] Dec 01 14:00:39 crc kubenswrapper[4585]: I1201 14:00:39.867913 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4g6p"] Dec 01 14:00:39 crc kubenswrapper[4585]: I1201 14:00:39.889222 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl7qq" event={"ID":"12d64d5a-7b7e-49c9-985a-14efebb14506","Type":"ContainerStarted","Data":"5acf4ef146e95a69e2b55fa9d37a8eac8fdb433279d3462de56b02b6b7550e9d"} Dec 01 14:00:39 crc kubenswrapper[4585]: I1201 14:00:39.896052 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" event={"ID":"6763aabd-f571-4b13-82fd-3a4a9bdf8406","Type":"ContainerStarted","Data":"3e1e827a43f74634644da947859b40ccd0b30f29fb64bd562cb1a345965e8529"} Dec 01 14:00:39 crc kubenswrapper[4585]: I1201 14:00:39.918910 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b5c9948e-47d0-4af1-b570-de1439b3184a","Type":"ContainerStarted","Data":"bb6a2d7c64fc8df86cf8277caa9571995bb88c5503387c8445415a576f99738d"} Dec 01 14:00:39 crc kubenswrapper[4585]: I1201 14:00:39.932266 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbnd6" event={"ID":"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91","Type":"ContainerStarted","Data":"16b6bc1e6d590246c20ad047ad80e802c5d041c9192f7ff13246213eef9d78ef"} Dec 01 14:00:40 crc kubenswrapper[4585]: I1201 14:00:40.096789 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:40 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:40 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:40 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:40 crc kubenswrapper[4585]: I1201 14:00:40.096846 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:40 crc kubenswrapper[4585]: I1201 14:00:40.732638 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:40 crc kubenswrapper[4585]: I1201 14:00:40.738903 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-m8p5b" Dec 01 14:00:40 crc kubenswrapper[4585]: I1201 14:00:40.971168 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" event={"ID":"6763aabd-f571-4b13-82fd-3a4a9bdf8406","Type":"ContainerStarted","Data":"bdadbd4ee03b3680b9bc0694394f0f784e6166981338a2c45679477dca2777cb"} Dec 01 14:00:40 crc kubenswrapper[4585]: I1201 14:00:40.972145 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:40 crc kubenswrapper[4585]: I1201 14:00:40.987119 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b5c9948e-47d0-4af1-b570-de1439b3184a","Type":"ContainerStarted","Data":"37d3a58918c7130cb3cf6cc4f8a1d40ed3a5d5e91242ccb5cddb86f64b772666"} Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.015297 4585 generic.go:334] "Generic (PLEG): container finished" podID="1162041e-acb6-4788-aaf3-841a80c7ec48" containerID="8ad45edb727d3dd1c1754597f1f88d3f21815af349298555d3fdedb2f8194f88" exitCode=0 Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.016576 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnxb" event={"ID":"1162041e-acb6-4788-aaf3-841a80c7ec48","Type":"ContainerDied","Data":"8ad45edb727d3dd1c1754597f1f88d3f21815af349298555d3fdedb2f8194f88"} Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.016770 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnxb" event={"ID":"1162041e-acb6-4788-aaf3-841a80c7ec48","Type":"ContainerStarted","Data":"c94ef80bb8391f109e3915b0fee608ea4dfbee36bbe16ecc6b402eb85e9761c6"} Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.035459 4585 generic.go:334] "Generic (PLEG): container finished" podID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" containerID="62d79488af697b9eb7d206b75aafc7e5cbe6d7e2b205e09fbc90a009d90474cc" exitCode=0 Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.035592 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbnd6" event={"ID":"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91","Type":"ContainerDied","Data":"62d79488af697b9eb7d206b75aafc7e5cbe6d7e2b205e09fbc90a009d90474cc"} Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.047075 4585 generic.go:334] "Generic (PLEG): container finished" podID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" containerID="90f45ad98aad2380aa74c734506295f309d0279ff869879f8e35293acb499e10" exitCode=0 
Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.047176 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4g6p" event={"ID":"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95","Type":"ContainerDied","Data":"90f45ad98aad2380aa74c734506295f309d0279ff869879f8e35293acb499e10"} Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.047227 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4g6p" event={"ID":"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95","Type":"ContainerStarted","Data":"81dd475d0c14c39c18cdf168b644a441438be656ce59b467e4c8809a5c707774"} Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.068376 4585 generic.go:334] "Generic (PLEG): container finished" podID="12d64d5a-7b7e-49c9-985a-14efebb14506" containerID="288de8d26d7811f16a133b6c85dabc0c848450d5e43bdb3d4ac31f5d68d7b62e" exitCode=0 Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.068930 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl7qq" event={"ID":"12d64d5a-7b7e-49c9-985a-14efebb14506","Type":"ContainerDied","Data":"288de8d26d7811f16a133b6c85dabc0c848450d5e43bdb3d4ac31f5d68d7b62e"} Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.085102 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" podStartSLOduration=135.08508305 podStartE2EDuration="2m15.08508305s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:41.026514783 +0000 UTC m=+155.010728638" watchObservedRunningTime="2025-12-01 14:00:41.08508305 +0000 UTC m=+155.069296905" Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.095210 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:41 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:41 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:41 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.095284 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.132536 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=4.132519408 podStartE2EDuration="4.132519408s" podCreationTimestamp="2025-12-01 14:00:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:41.095756953 +0000 UTC m=+155.079970808" watchObservedRunningTime="2025-12-01 14:00:41.132519408 +0000 UTC m=+155.116733263" Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.725283 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.726427 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.737899 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.738124 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.751491 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.790035 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f315616d-10a1-46a1-93d5-179ab1773b2c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f315616d-10a1-46a1-93d5-179ab1773b2c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.790108 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f315616d-10a1-46a1-93d5-179ab1773b2c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f315616d-10a1-46a1-93d5-179ab1773b2c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.891475 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f315616d-10a1-46a1-93d5-179ab1773b2c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f315616d-10a1-46a1-93d5-179ab1773b2c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.891546 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f315616d-10a1-46a1-93d5-179ab1773b2c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f315616d-10a1-46a1-93d5-179ab1773b2c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.891713 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f315616d-10a1-46a1-93d5-179ab1773b2c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f315616d-10a1-46a1-93d5-179ab1773b2c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 14:00:41 crc kubenswrapper[4585]: I1201 14:00:41.941615 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f315616d-10a1-46a1-93d5-179ab1773b2c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f315616d-10a1-46a1-93d5-179ab1773b2c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 14:00:42 crc kubenswrapper[4585]: I1201 14:00:42.059931 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 14:00:42 crc kubenswrapper[4585]: I1201 14:00:42.110632 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:42 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:42 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:42 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:42 crc kubenswrapper[4585]: I1201 14:00:42.111182 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:42 crc kubenswrapper[4585]: I1201 14:00:42.148275 4585 generic.go:334] "Generic (PLEG): container finished" podID="b5c9948e-47d0-4af1-b570-de1439b3184a" containerID="37d3a58918c7130cb3cf6cc4f8a1d40ed3a5d5e91242ccb5cddb86f64b772666" exitCode=0 Dec 01 14:00:42 crc kubenswrapper[4585]: I1201 14:00:42.149924 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b5c9948e-47d0-4af1-b570-de1439b3184a","Type":"ContainerDied","Data":"37d3a58918c7130cb3cf6cc4f8a1d40ed3a5d5e91242ccb5cddb86f64b772666"} Dec 01 14:00:43 crc kubenswrapper[4585]: I1201 14:00:43.057276 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 14:00:43 crc kubenswrapper[4585]: I1201 14:00:43.093792 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:43 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:43 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:43 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:43 crc kubenswrapper[4585]: I1201 14:00:43.093849 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:43 crc kubenswrapper[4585]: I1201 14:00:43.726670 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:00:43 crc kubenswrapper[4585]: I1201 14:00:43.727052 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:00:43 crc kubenswrapper[4585]: I1201 14:00:43.825753 4585 patch_prober.go:28] interesting pod/console-f9d7485db-f9k95 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" 
start-of-body= Dec 01 14:00:43 crc kubenswrapper[4585]: I1201 14:00:43.825870 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-f9k95" podUID="bb6e47d0-5966-48d3-be81-97265e7e7a4f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.042791 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.088693 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5c9948e-47d0-4af1-b570-de1439b3184a-kubelet-dir\") pod \"b5c9948e-47d0-4af1-b570-de1439b3184a\" (UID: \"b5c9948e-47d0-4af1-b570-de1439b3184a\") " Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.088814 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5c9948e-47d0-4af1-b570-de1439b3184a-kube-api-access\") pod \"b5c9948e-47d0-4af1-b570-de1439b3184a\" (UID: \"b5c9948e-47d0-4af1-b570-de1439b3184a\") " Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.088825 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5c9948e-47d0-4af1-b570-de1439b3184a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b5c9948e-47d0-4af1-b570-de1439b3184a" (UID: "b5c9948e-47d0-4af1-b570-de1439b3184a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.089178 4585 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b5c9948e-47d0-4af1-b570-de1439b3184a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.096516 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:44 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:44 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:44 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.097528 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.111800 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c9948e-47d0-4af1-b570-de1439b3184a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b5c9948e-47d0-4af1-b570-de1439b3184a" (UID: "b5c9948e-47d0-4af1-b570-de1439b3184a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.191054 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5c9948e-47d0-4af1-b570-de1439b3184a-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.219469 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b5c9948e-47d0-4af1-b570-de1439b3184a","Type":"ContainerDied","Data":"bb6a2d7c64fc8df86cf8277caa9571995bb88c5503387c8445415a576f99738d"} Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.219519 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb6a2d7c64fc8df86cf8277caa9571995bb88c5503387c8445415a576f99738d" Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.219680 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.235268 4585 generic.go:334] "Generic (PLEG): container finished" podID="1ca012d6-094a-4703-b8cc-d9d53fa9886d" containerID="589913f8e0eccdf800c0ca0f20d5850b40b34cbd7ee4a27991847f30a8b4690f" exitCode=0 Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.236993 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" event={"ID":"1ca012d6-094a-4703-b8cc-d9d53fa9886d","Type":"ContainerDied","Data":"589913f8e0eccdf800c0ca0f20d5850b40b34cbd7ee4a27991847f30a8b4690f"} Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.252361 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f315616d-10a1-46a1-93d5-179ab1773b2c","Type":"ContainerStarted","Data":"78d6ebd97bb7cbe0bcfb424f0769f29ecfacaaf56dee5b67e8ca9990b952a6e5"} Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.728673 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-d2nj6" Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.822230 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.822301 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.822590 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:00:44 crc kubenswrapper[4585]: I1201 14:00:44.822709 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: 
connection refused" Dec 01 14:00:45 crc kubenswrapper[4585]: I1201 14:00:45.095654 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:45 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:45 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:45 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:45 crc kubenswrapper[4585]: I1201 14:00:45.095723 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:45 crc kubenswrapper[4585]: I1201 14:00:45.283075 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f315616d-10a1-46a1-93d5-179ab1773b2c","Type":"ContainerStarted","Data":"f80477d0f0ccdb3a0d3aec03e9867b9edf5b92ce2fcbd5ae081be65ca38e2ac0"} Dec 01 14:00:45 crc kubenswrapper[4585]: I1201 14:00:45.309957 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.309939908 podStartE2EDuration="4.309939908s" podCreationTimestamp="2025-12-01 14:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:45.306765376 +0000 UTC m=+159.290979231" watchObservedRunningTime="2025-12-01 14:00:45.309939908 +0000 UTC m=+159.294153753" Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.289464 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:46 crc kubenswrapper[4585]: [-]has-synced failed: reason withheld Dec 01 14:00:46 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:46 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.289527 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.452416 4585 generic.go:334] "Generic (PLEG): container finished" podID="f315616d-10a1-46a1-93d5-179ab1773b2c" containerID="f80477d0f0ccdb3a0d3aec03e9867b9edf5b92ce2fcbd5ae081be65ca38e2ac0" exitCode=0 Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.452475 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f315616d-10a1-46a1-93d5-179ab1773b2c","Type":"ContainerDied","Data":"f80477d0f0ccdb3a0d3aec03e9867b9edf5b92ce2fcbd5ae081be65ca38e2ac0"} Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.623267 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.701582 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca012d6-094a-4703-b8cc-d9d53fa9886d-config-volume\") pod \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\" (UID: \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\") " Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.701785 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca012d6-094a-4703-b8cc-d9d53fa9886d-secret-volume\") pod \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\" (UID: \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\") " Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.701898 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsrrk\" (UniqueName: \"kubernetes.io/projected/1ca012d6-094a-4703-b8cc-d9d53fa9886d-kube-api-access-qsrrk\") pod \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\" (UID: \"1ca012d6-094a-4703-b8cc-d9d53fa9886d\") " Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.714095 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ca012d6-094a-4703-b8cc-d9d53fa9886d-config-volume" (OuterVolumeSpecName: "config-volume") pod "1ca012d6-094a-4703-b8cc-d9d53fa9886d" (UID: "1ca012d6-094a-4703-b8cc-d9d53fa9886d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.742524 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca012d6-094a-4703-b8cc-d9d53fa9886d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1ca012d6-094a-4703-b8cc-d9d53fa9886d" (UID: "1ca012d6-094a-4703-b8cc-d9d53fa9886d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.745862 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca012d6-094a-4703-b8cc-d9d53fa9886d-kube-api-access-qsrrk" (OuterVolumeSpecName: "kube-api-access-qsrrk") pod "1ca012d6-094a-4703-b8cc-d9d53fa9886d" (UID: "1ca012d6-094a-4703-b8cc-d9d53fa9886d"). InnerVolumeSpecName "kube-api-access-qsrrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.803730 4585 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca012d6-094a-4703-b8cc-d9d53fa9886d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.803776 4585 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca012d6-094a-4703-b8cc-d9d53fa9886d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 14:00:46 crc kubenswrapper[4585]: I1201 14:00:46.803788 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsrrk\" (UniqueName: \"kubernetes.io/projected/1ca012d6-094a-4703-b8cc-d9d53fa9886d-kube-api-access-qsrrk\") on node \"crc\" DevicePath \"\"" Dec 01 14:00:47 crc kubenswrapper[4585]: I1201 14:00:47.085580 4585 patch_prober.go:28] interesting pod/router-default-5444994796-fxg9h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 14:00:47 crc kubenswrapper[4585]: [+]has-synced ok Dec 01 14:00:47 crc kubenswrapper[4585]: [+]process-running ok Dec 01 14:00:47 crc kubenswrapper[4585]: healthz check failed Dec 01 14:00:47 crc kubenswrapper[4585]: I1201 14:00:47.085628 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxg9h" podUID="e42acc88-ce5b-4fc0-b4a6-c3c78fcf8428" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 14:00:47 crc kubenswrapper[4585]: I1201 14:00:47.472936 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" Dec 01 14:00:47 crc kubenswrapper[4585]: I1201 14:00:47.472924 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5" event={"ID":"1ca012d6-094a-4703-b8cc-d9d53fa9886d","Type":"ContainerDied","Data":"a9aad827d52d1f52de173a6dcfb71211259ba5168b0d26b084d15fbd09147e29"} Dec 01 14:00:47 crc kubenswrapper[4585]: I1201 14:00:47.473642 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9aad827d52d1f52de173a6dcfb71211259ba5168b0d26b084d15fbd09147e29" Dec 01 14:00:48 crc kubenswrapper[4585]: I1201 14:00:48.086473 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:48 crc kubenswrapper[4585]: I1201 14:00:48.089478 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-fxg9h" Dec 01 14:00:48 crc kubenswrapper[4585]: I1201 14:00:48.494332 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f315616d-10a1-46a1-93d5-179ab1773b2c","Type":"ContainerDied","Data":"78d6ebd97bb7cbe0bcfb424f0769f29ecfacaaf56dee5b67e8ca9990b952a6e5"} Dec 01 14:00:48 crc kubenswrapper[4585]: I1201 14:00:48.494637 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78d6ebd97bb7cbe0bcfb424f0769f29ecfacaaf56dee5b67e8ca9990b952a6e5" Dec 01 14:00:48 crc kubenswrapper[4585]: I1201 14:00:48.509232 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 14:00:48 crc kubenswrapper[4585]: I1201 14:00:48.583006 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f315616d-10a1-46a1-93d5-179ab1773b2c-kubelet-dir\") pod \"f315616d-10a1-46a1-93d5-179ab1773b2c\" (UID: \"f315616d-10a1-46a1-93d5-179ab1773b2c\") " Dec 01 14:00:48 crc kubenswrapper[4585]: I1201 14:00:48.583117 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f315616d-10a1-46a1-93d5-179ab1773b2c-kube-api-access\") pod \"f315616d-10a1-46a1-93d5-179ab1773b2c\" (UID: \"f315616d-10a1-46a1-93d5-179ab1773b2c\") " Dec 01 14:00:48 crc kubenswrapper[4585]: I1201 14:00:48.583541 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f315616d-10a1-46a1-93d5-179ab1773b2c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f315616d-10a1-46a1-93d5-179ab1773b2c" (UID: "f315616d-10a1-46a1-93d5-179ab1773b2c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:00:48 crc kubenswrapper[4585]: I1201 14:00:48.598431 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f315616d-10a1-46a1-93d5-179ab1773b2c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f315616d-10a1-46a1-93d5-179ab1773b2c" (UID: "f315616d-10a1-46a1-93d5-179ab1773b2c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:00:48 crc kubenswrapper[4585]: I1201 14:00:48.685132 4585 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f315616d-10a1-46a1-93d5-179ab1773b2c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 14:00:48 crc kubenswrapper[4585]: I1201 14:00:48.685173 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f315616d-10a1-46a1-93d5-179ab1773b2c-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 14:00:49 crc kubenswrapper[4585]: I1201 14:00:49.297116 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:49 crc kubenswrapper[4585]: I1201 14:00:49.304613 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f11a95e1-135a-4fd2-9a04-1487c56a18e1-metrics-certs\") pod \"network-metrics-daemon-qrdw5\" (UID: \"f11a95e1-135a-4fd2-9a04-1487c56a18e1\") " pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:49 crc kubenswrapper[4585]: I1201 14:00:49.343210 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qrdw5" Dec 01 14:00:49 crc kubenswrapper[4585]: I1201 14:00:49.522998 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 14:00:50 crc kubenswrapper[4585]: I1201 14:00:50.790875 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qrdw5"] Dec 01 14:00:51 crc kubenswrapper[4585]: I1201 14:00:51.544692 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" event={"ID":"f11a95e1-135a-4fd2-9a04-1487c56a18e1","Type":"ContainerStarted","Data":"0e13b4045d2a50539182ba6ac5e244186ef4948836681bbf238f892c412146d5"} Dec 01 14:00:53 crc kubenswrapper[4585]: I1201 14:00:53.933349 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:53 crc kubenswrapper[4585]: I1201 14:00:53.942618 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:00:54 crc kubenswrapper[4585]: I1201 14:00:54.834908 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:00:54 crc kubenswrapper[4585]: I1201 14:00:54.835246 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:00:54 crc kubenswrapper[4585]: I1201 14:00:54.835312 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-dsfs8" Dec 01 14:00:54 crc kubenswrapper[4585]: I1201 14:00:54.835902 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"b49db3d9ac3c761d310e04175bc8ae16aeae4f17d66fafb39d0c663aacb5789b"} pod="openshift-console/downloads-7954f5f757-dsfs8" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 01 14:00:54 crc kubenswrapper[4585]: I1201 14:00:54.836009 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" containerID="cri-o://b49db3d9ac3c761d310e04175bc8ae16aeae4f17d66fafb39d0c663aacb5789b" gracePeriod=2 Dec 01 14:00:54 crc kubenswrapper[4585]: I1201 14:00:54.838995 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:00:54 crc kubenswrapper[4585]: I1201 14:00:54.839130 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:00:54 crc kubenswrapper[4585]: I1201 14:00:54.840005 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:00:54 crc kubenswrapper[4585]: I1201 14:00:54.840074 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:00:55 crc kubenswrapper[4585]: I1201 14:00:55.681849 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" event={"ID":"f11a95e1-135a-4fd2-9a04-1487c56a18e1","Type":"ContainerStarted","Data":"b2c3c5dc33a71e0f0fd3e009c3da2a7596a28a83b8a40628a1b30cf96068ce89"} Dec 01 14:00:55 crc kubenswrapper[4585]: I1201 14:00:55.691567 4585 generic.go:334] "Generic (PLEG): container finished" podID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerID="b49db3d9ac3c761d310e04175bc8ae16aeae4f17d66fafb39d0c663aacb5789b" exitCode=0 Dec 01 14:00:55 crc kubenswrapper[4585]: I1201 14:00:55.691617 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dsfs8" event={"ID":"c239d6eb-535e-442b-a67a-f8227313ceb4","Type":"ContainerDied","Data":"b49db3d9ac3c761d310e04175bc8ae16aeae4f17d66fafb39d0c663aacb5789b"} Dec 01 14:00:57 crc kubenswrapper[4585]: I1201 14:00:57.959305 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:00:57 crc kubenswrapper[4585]: I1201 14:00:57.961532 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qrdw5" event={"ID":"f11a95e1-135a-4fd2-9a04-1487c56a18e1","Type":"ContainerStarted","Data":"6b5768d580440ac9e76138e3f00641bfafd2e3cc1ba96bb227cb521626aa0c89"} Dec 01 14:01:04 crc kubenswrapper[4585]: I1201 14:01:04.825796 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:01:04 crc kubenswrapper[4585]: I1201 14:01:04.826645 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:01:06 crc kubenswrapper[4585]: I1201 14:01:06.373053 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-79qpf" Dec 01 14:01:06 crc kubenswrapper[4585]: I1201 14:01:06.547714 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qrdw5" podStartSLOduration=160.547697275 podStartE2EDuration="2m40.547697275s" podCreationTimestamp="2025-12-01 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:00:58.441819893 +0000 UTC m=+172.426033748" watchObservedRunningTime="2025-12-01 14:01:06.547697275 +0000 UTC m=+180.531911130" Dec 01 14:01:13 crc kubenswrapper[4585]: I1201 14:01:13.719408 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:01:13 crc kubenswrapper[4585]: I1201 14:01:13.719918 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:01:13 crc kubenswrapper[4585]: I1201 14:01:13.757761 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 14:01:14 crc kubenswrapper[4585]: I1201 14:01:14.822793 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:01:14 crc kubenswrapper[4585]: I1201 14:01:14.823193 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.291442 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 14:01:16 crc kubenswrapper[4585]: E1201 14:01:16.291789 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c9948e-47d0-4af1-b570-de1439b3184a" containerName="pruner" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.291802 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c9948e-47d0-4af1-b570-de1439b3184a" containerName="pruner" Dec 01 14:01:16 crc kubenswrapper[4585]: E1201 14:01:16.291816 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f315616d-10a1-46a1-93d5-179ab1773b2c" containerName="pruner" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.291823 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f315616d-10a1-46a1-93d5-179ab1773b2c" containerName="pruner" Dec 01 14:01:16 crc kubenswrapper[4585]: E1201 14:01:16.291836 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca012d6-094a-4703-b8cc-d9d53fa9886d" containerName="collect-profiles" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.291843 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca012d6-094a-4703-b8cc-d9d53fa9886d" containerName="collect-profiles" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.291962 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca012d6-094a-4703-b8cc-d9d53fa9886d" containerName="collect-profiles" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.291990 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c9948e-47d0-4af1-b570-de1439b3184a" containerName="pruner" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.291999 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f315616d-10a1-46a1-93d5-179ab1773b2c" containerName="pruner" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.292558 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.296453 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.297576 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.301345 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.306997 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f89b57e3-25af-4211-9f70-7f24ce074512-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f89b57e3-25af-4211-9f70-7f24ce074512\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.307045 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f89b57e3-25af-4211-9f70-7f24ce074512-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f89b57e3-25af-4211-9f70-7f24ce074512\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.408049 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f89b57e3-25af-4211-9f70-7f24ce074512-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f89b57e3-25af-4211-9f70-7f24ce074512\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.408102 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f89b57e3-25af-4211-9f70-7f24ce074512-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f89b57e3-25af-4211-9f70-7f24ce074512\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.408191 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f89b57e3-25af-4211-9f70-7f24ce074512-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f89b57e3-25af-4211-9f70-7f24ce074512\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.438954 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f89b57e3-25af-4211-9f70-7f24ce074512-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f89b57e3-25af-4211-9f70-7f24ce074512\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 14:01:16 crc kubenswrapper[4585]: I1201 14:01:16.618307 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.092869 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.110240 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.110469 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.234722 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-var-lock\") pod \"installer-9-crc\" (UID: \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.235566 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-kube-api-access\") pod \"installer-9-crc\" (UID: \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.235638 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.338141 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.338223 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-var-lock\") pod \"installer-9-crc\" (UID: \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.338248 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-kube-api-access\") pod \"installer-9-crc\" (UID: \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.338713 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-var-lock\") pod \"installer-9-crc\" (UID: \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.339151 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.374875 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-kube-api-access\") pod \"installer-9-crc\" (UID: \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:01:22 crc kubenswrapper[4585]: I1201 14:01:22.442773 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:01:24 crc kubenswrapper[4585]: I1201 14:01:24.823614 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:01:24 crc kubenswrapper[4585]: I1201 14:01:24.824247 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:01:31 crc kubenswrapper[4585]: E1201 14:01:31.062480 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 14:01:31 crc kubenswrapper[4585]: E1201 14:01:31.063742 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v5fhb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ng9zf_openshift-marketplace(611e970e-43b2-43b8-b2d6-6302693b7c88): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
logger="UnhandledError" Dec 01 14:01:31 crc kubenswrapper[4585]: E1201 14:01:31.065104 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ng9zf" podUID="611e970e-43b2-43b8-b2d6-6302693b7c88" Dec 01 14:01:32 crc kubenswrapper[4585]: E1201 14:01:32.447885 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ng9zf" podUID="611e970e-43b2-43b8-b2d6-6302693b7c88" Dec 01 14:01:32 crc kubenswrapper[4585]: E1201 14:01:32.528043 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 14:01:32 crc kubenswrapper[4585]: E1201 14:01:32.528275 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25jx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rl7qq_openshift-marketplace(12d64d5a-7b7e-49c9-985a-14efebb14506): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 14:01:32 crc kubenswrapper[4585]: E1201 14:01:32.529458 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rl7qq" podUID="12d64d5a-7b7e-49c9-985a-14efebb14506" Dec 01 14:01:32 crc kubenswrapper[4585]: E1201 14:01:32.554110 4585 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 14:01:32 crc kubenswrapper[4585]: E1201 14:01:32.554418 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-trbch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-s4g6p_openshift-marketplace(7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 14:01:32 crc kubenswrapper[4585]: E1201 14:01:32.555656 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-s4g6p" podUID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" Dec 01 14:01:34 crc kubenswrapper[4585]: I1201 14:01:34.822163 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:01:34 crc kubenswrapper[4585]: I1201 14:01:34.823106 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.066304 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-rl7qq" podUID="12d64d5a-7b7e-49c9-985a-14efebb14506" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.066872 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-s4g6p" podUID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.228171 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.228321 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rhk9g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-gjnxb_openshift-marketplace(1162041e-acb6-4788-aaf3-841a80c7ec48): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.230233 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gjnxb" podUID="1162041e-acb6-4788-aaf3-841a80c7ec48" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.253209 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.253366 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dfzqw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nh772_openshift-marketplace(96c46a5f-2a97-42e9-bf04-4a4b83caf45f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.254774 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nh772" podUID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.273552 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.273717 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9wsmr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qzv99_openshift-marketplace(2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.275500 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qzv99" podUID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.290428 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.290624 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pzp4k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-sbnd6_openshift-marketplace(51b10af7-6b4e-49b2-81cd-50b2a5d5fd91): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.293291 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-sbnd6" podUID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" Dec 01 14:01:38 crc kubenswrapper[4585]: I1201 14:01:38.490347 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 14:01:38 crc kubenswrapper[4585]: I1201 14:01:38.798519 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.805419 4585 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd194ad4_dd93_47aa_8c18_afc2426825ac.slice/crio-4c399aadea5fc5f39423bfdf9310c21cb26ebc1e0d326fe334f70cc64e5a2773.scope\": RecentStats: unable to find data in memory cache]" Dec 01 14:01:38 crc kubenswrapper[4585]: I1201 14:01:38.962513 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6","Type":"ContainerStarted","Data":"17b74ee182723b757e0fde498312ce4aea036e9c77fa30090ca7dcadbc91d3b6"} Dec 01 14:01:38 crc kubenswrapper[4585]: I1201 14:01:38.964952 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f89b57e3-25af-4211-9f70-7f24ce074512","Type":"ContainerStarted","Data":"791fa9463ebd4f97ae1a92d1d6e991c9bbef9869f8962b9a0871c3f4cd9f52aa"} Dec 01 14:01:38 crc kubenswrapper[4585]: I1201 14:01:38.968251 4585 generic.go:334] "Generic (PLEG): container finished" podID="dd194ad4-dd93-47aa-8c18-afc2426825ac" 
containerID="4c399aadea5fc5f39423bfdf9310c21cb26ebc1e0d326fe334f70cc64e5a2773" exitCode=0 Dec 01 14:01:38 crc kubenswrapper[4585]: I1201 14:01:38.968318 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pvrj" event={"ID":"dd194ad4-dd93-47aa-8c18-afc2426825ac","Type":"ContainerDied","Data":"4c399aadea5fc5f39423bfdf9310c21cb26ebc1e0d326fe334f70cc64e5a2773"} Dec 01 14:01:38 crc kubenswrapper[4585]: I1201 14:01:38.978072 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dsfs8" event={"ID":"c239d6eb-535e-442b-a67a-f8227313ceb4","Type":"ContainerStarted","Data":"a58c62c9f1adba8e274dea7efaf32bb1394178be6976293533670a34f1c9ed51"} Dec 01 14:01:38 crc kubenswrapper[4585]: I1201 14:01:38.978698 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dsfs8" Dec 01 14:01:38 crc kubenswrapper[4585]: I1201 14:01:38.978879 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:01:38 crc kubenswrapper[4585]: I1201 14:01:38.978930 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.980606 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nh772" podUID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.980886 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gjnxb" podUID="1162041e-acb6-4788-aaf3-841a80c7ec48" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.982257 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qzv99" podUID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" Dec 01 14:01:38 crc kubenswrapper[4585]: E1201 14:01:38.984106 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-sbnd6" podUID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" Dec 01 14:01:39 crc kubenswrapper[4585]: I1201 14:01:39.988472 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6","Type":"ContainerStarted","Data":"3ec704010cac7e317585e795ee58f832edb80dd79118da87562bb2da53f0201f"} Dec 01 14:01:39 crc kubenswrapper[4585]: I1201 14:01:39.991384 
4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f89b57e3-25af-4211-9f70-7f24ce074512","Type":"ContainerStarted","Data":"f619b58e688dcbf248623c5ad16c4175fb0dd75b736138ab69daac3029f1d821"} Dec 01 14:01:39 crc kubenswrapper[4585]: I1201 14:01:39.991935 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:01:39 crc kubenswrapper[4585]: I1201 14:01:39.991988 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:01:40 crc kubenswrapper[4585]: I1201 14:01:40.008734 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=18.008711733 podStartE2EDuration="18.008711733s" podCreationTimestamp="2025-12-01 14:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:01:40.002645357 +0000 UTC m=+213.986859212" watchObservedRunningTime="2025-12-01 14:01:40.008711733 +0000 UTC m=+213.992925588" Dec 01 14:01:40 crc kubenswrapper[4585]: I1201 14:01:40.025224 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=24.025003001 podStartE2EDuration="24.025003001s" podCreationTimestamp="2025-12-01 14:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:01:40.024519816 +0000 UTC m=+214.008733671" watchObservedRunningTime="2025-12-01 14:01:40.025003001 +0000 UTC m=+214.009216856" Dec 01 14:01:40 crc kubenswrapper[4585]: I1201 14:01:40.995863 4585 generic.go:334] "Generic (PLEG): container finished" podID="f89b57e3-25af-4211-9f70-7f24ce074512" containerID="f619b58e688dcbf248623c5ad16c4175fb0dd75b736138ab69daac3029f1d821" exitCode=0 Dec 01 14:01:40 crc kubenswrapper[4585]: I1201 14:01:40.996723 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f89b57e3-25af-4211-9f70-7f24ce074512","Type":"ContainerDied","Data":"f619b58e688dcbf248623c5ad16c4175fb0dd75b736138ab69daac3029f1d821"} Dec 01 14:01:42 crc kubenswrapper[4585]: I1201 14:01:42.265961 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 14:01:42 crc kubenswrapper[4585]: I1201 14:01:42.301836 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f89b57e3-25af-4211-9f70-7f24ce074512-kube-api-access\") pod \"f89b57e3-25af-4211-9f70-7f24ce074512\" (UID: \"f89b57e3-25af-4211-9f70-7f24ce074512\") " Dec 01 14:01:42 crc kubenswrapper[4585]: I1201 14:01:42.302056 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f89b57e3-25af-4211-9f70-7f24ce074512-kubelet-dir\") pod \"f89b57e3-25af-4211-9f70-7f24ce074512\" (UID: \"f89b57e3-25af-4211-9f70-7f24ce074512\") " Dec 01 14:01:42 crc kubenswrapper[4585]: I1201 14:01:42.302200 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f89b57e3-25af-4211-9f70-7f24ce074512-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f89b57e3-25af-4211-9f70-7f24ce074512" (UID: "f89b57e3-25af-4211-9f70-7f24ce074512"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:01:42 crc kubenswrapper[4585]: I1201 14:01:42.302469 4585 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f89b57e3-25af-4211-9f70-7f24ce074512-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 14:01:42 crc kubenswrapper[4585]: I1201 14:01:42.316008 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89b57e3-25af-4211-9f70-7f24ce074512-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f89b57e3-25af-4211-9f70-7f24ce074512" (UID: "f89b57e3-25af-4211-9f70-7f24ce074512"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:01:42 crc kubenswrapper[4585]: I1201 14:01:42.402841 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f89b57e3-25af-4211-9f70-7f24ce074512-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 14:01:43 crc kubenswrapper[4585]: I1201 14:01:43.054807 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"f89b57e3-25af-4211-9f70-7f24ce074512","Type":"ContainerDied","Data":"791fa9463ebd4f97ae1a92d1d6e991c9bbef9869f8962b9a0871c3f4cd9f52aa"} Dec 01 14:01:43 crc kubenswrapper[4585]: I1201 14:01:43.054843 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="791fa9463ebd4f97ae1a92d1d6e991c9bbef9869f8962b9a0871c3f4cd9f52aa" Dec 01 14:01:43 crc kubenswrapper[4585]: I1201 14:01:43.054866 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 14:01:43 crc kubenswrapper[4585]: I1201 14:01:43.057220 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pvrj" event={"ID":"dd194ad4-dd93-47aa-8c18-afc2426825ac","Type":"ContainerStarted","Data":"4f4e4af5ef279dde1bdad376628171947106224bcb50a5cb0bb243de18e71d34"} Dec 01 14:01:43 crc kubenswrapper[4585]: I1201 14:01:43.716453 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:01:43 crc kubenswrapper[4585]: I1201 14:01:43.716802 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:01:43 crc kubenswrapper[4585]: I1201 14:01:43.716924 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:01:43 crc kubenswrapper[4585]: I1201 14:01:43.717595 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:01:43 crc kubenswrapper[4585]: I1201 14:01:43.717650 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00" gracePeriod=600 Dec 01 14:01:44 crc kubenswrapper[4585]: I1201 14:01:44.821663 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:01:44 crc kubenswrapper[4585]: I1201 14:01:44.821742 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 01 14:01:44 crc kubenswrapper[4585]: I1201 14:01:44.821663 4585 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsfs8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 01 14:01:44 crc kubenswrapper[4585]: I1201 14:01:44.821842 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsfs8" podUID="c239d6eb-535e-442b-a67a-f8227313ceb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: 
connect: connection refused" Dec 01 14:01:45 crc kubenswrapper[4585]: I1201 14:01:45.594100 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:01:45 crc kubenswrapper[4585]: I1201 14:01:45.594163 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:01:46 crc kubenswrapper[4585]: I1201 14:01:46.076280 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00" exitCode=0 Dec 01 14:01:46 crc kubenswrapper[4585]: I1201 14:01:46.076325 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00"} Dec 01 14:01:46 crc kubenswrapper[4585]: I1201 14:01:46.979320 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-8pvrj" podUID="dd194ad4-dd93-47aa-8c18-afc2426825ac" containerName="registry-server" probeResult="failure" output=< Dec 01 14:01:46 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:01:46 crc kubenswrapper[4585]: > Dec 01 14:01:47 crc kubenswrapper[4585]: I1201 14:01:47.086065 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"8d257e580d36e30a7107591145ef0a5ff804617e4fa8a607c1d45bf357edd6a4"} Dec 01 14:01:47 crc kubenswrapper[4585]: I1201 14:01:47.106171 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8pvrj" podStartSLOduration=9.856013522 podStartE2EDuration="1m13.106152101s" podCreationTimestamp="2025-12-01 14:00:34 +0000 UTC" firstStartedPulling="2025-12-01 14:00:38.794817107 +0000 UTC m=+152.779030962" lastFinishedPulling="2025-12-01 14:01:42.044955696 +0000 UTC m=+216.029169541" observedRunningTime="2025-12-01 14:01:43.093757051 +0000 UTC m=+217.077970906" watchObservedRunningTime="2025-12-01 14:01:47.106152101 +0000 UTC m=+221.090365956" Dec 01 14:01:50 crc kubenswrapper[4585]: I1201 14:01:50.107500 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng9zf" event={"ID":"611e970e-43b2-43b8-b2d6-6302693b7c88","Type":"ContainerStarted","Data":"63d88b49d8954654528f1e4b5b5f533b29b2f08473dc1c3c574a338d044563a7"} Dec 01 14:01:51 crc kubenswrapper[4585]: I1201 14:01:51.115899 4585 generic.go:334] "Generic (PLEG): container finished" podID="611e970e-43b2-43b8-b2d6-6302693b7c88" containerID="63d88b49d8954654528f1e4b5b5f533b29b2f08473dc1c3c574a338d044563a7" exitCode=0 Dec 01 14:01:51 crc kubenswrapper[4585]: I1201 14:01:51.115961 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng9zf" event={"ID":"611e970e-43b2-43b8-b2d6-6302693b7c88","Type":"ContainerDied","Data":"63d88b49d8954654528f1e4b5b5f533b29b2f08473dc1c3c574a338d044563a7"} Dec 01 14:01:52 crc kubenswrapper[4585]: I1201 14:01:52.165886 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng9zf" 
event={"ID":"611e970e-43b2-43b8-b2d6-6302693b7c88","Type":"ContainerStarted","Data":"7041f1f0c8846077d11065328a96929403ac3a75facb56cff62e8907b7b575c6"} Dec 01 14:01:52 crc kubenswrapper[4585]: I1201 14:01:52.192437 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ng9zf" podStartSLOduration=5.34764355 podStartE2EDuration="1m18.192412842s" podCreationTimestamp="2025-12-01 14:00:34 +0000 UTC" firstStartedPulling="2025-12-01 14:00:38.826593171 +0000 UTC m=+152.810807026" lastFinishedPulling="2025-12-01 14:01:51.671362443 +0000 UTC m=+225.655576318" observedRunningTime="2025-12-01 14:01:52.189435621 +0000 UTC m=+226.173649476" watchObservedRunningTime="2025-12-01 14:01:52.192412842 +0000 UTC m=+226.176626697" Dec 01 14:01:53 crc kubenswrapper[4585]: I1201 14:01:53.176324 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl7qq" event={"ID":"12d64d5a-7b7e-49c9-985a-14efebb14506","Type":"ContainerStarted","Data":"25249b6ec947d6528b9a42f33b00c7eccf54e78b8dc280cf8fb8a977b3bb4d71"} Dec 01 14:01:54 crc kubenswrapper[4585]: I1201 14:01:54.213135 4585 generic.go:334] "Generic (PLEG): container finished" podID="12d64d5a-7b7e-49c9-985a-14efebb14506" containerID="25249b6ec947d6528b9a42f33b00c7eccf54e78b8dc280cf8fb8a977b3bb4d71" exitCode=0 Dec 01 14:01:54 crc kubenswrapper[4585]: I1201 14:01:54.213210 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl7qq" event={"ID":"12d64d5a-7b7e-49c9-985a-14efebb14506","Type":"ContainerDied","Data":"25249b6ec947d6528b9a42f33b00c7eccf54e78b8dc280cf8fb8a977b3bb4d71"} Dec 01 14:01:54 crc kubenswrapper[4585]: I1201 14:01:54.233137 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbnd6" event={"ID":"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91","Type":"ContainerStarted","Data":"5ffd1f9a2b8eaf70cf7ece20438894648d313b347b383cc6d739d279439a2c36"} Dec 01 14:01:54 crc kubenswrapper[4585]: I1201 14:01:54.765027 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:01:54 crc kubenswrapper[4585]: I1201 14:01:54.765447 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:01:54 crc kubenswrapper[4585]: I1201 14:01:54.833438 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dsfs8" Dec 01 14:01:54 crc kubenswrapper[4585]: I1201 14:01:54.917698 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:01:55 crc kubenswrapper[4585]: I1201 14:01:55.664477 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:01:55 crc kubenswrapper[4585]: I1201 14:01:55.733618 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:01:57 crc kubenswrapper[4585]: I1201 14:01:57.254157 4585 generic.go:334] "Generic (PLEG): container finished" podID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" containerID="5ffd1f9a2b8eaf70cf7ece20438894648d313b347b383cc6d739d279439a2c36" exitCode=0 Dec 01 14:01:57 crc kubenswrapper[4585]: I1201 14:01:57.254239 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-sbnd6" event={"ID":"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91","Type":"ContainerDied","Data":"5ffd1f9a2b8eaf70cf7ece20438894648d313b347b383cc6d739d279439a2c36"} Dec 01 14:02:00 crc kubenswrapper[4585]: I1201 14:02:00.300984 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl7qq" event={"ID":"12d64d5a-7b7e-49c9-985a-14efebb14506","Type":"ContainerStarted","Data":"3936e4f44cdc152844f8fba7c32b857b09f198c698e40bc39f378235451d2a64"} Dec 01 14:02:00 crc kubenswrapper[4585]: I1201 14:02:00.305963 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnxb" event={"ID":"1162041e-acb6-4788-aaf3-841a80c7ec48","Type":"ContainerStarted","Data":"bfa20b646fc6d8e8bfa11e5686aa09c0ae3d372cd15389c1273737b8fdd6cac4"} Dec 01 14:02:00 crc kubenswrapper[4585]: I1201 14:02:00.308249 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4g6p" event={"ID":"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95","Type":"ContainerStarted","Data":"16cd0e6102abbffedb681bf756d6e41f46664d88d70b5deec895ca09f217be61"} Dec 01 14:02:00 crc kubenswrapper[4585]: I1201 14:02:00.310067 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh772" event={"ID":"96c46a5f-2a97-42e9-bf04-4a4b83caf45f","Type":"ContainerStarted","Data":"c0edc4779e0f1f58941c7f63066e7ff2ad039460585f94200895e58302c41f6c"} Dec 01 14:02:00 crc kubenswrapper[4585]: I1201 14:02:00.386035 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rl7qq" podStartSLOduration=6.658562913 podStartE2EDuration="1m24.386014103s" podCreationTimestamp="2025-12-01 14:00:36 +0000 UTC" firstStartedPulling="2025-12-01 14:00:41.077517586 +0000 UTC m=+155.061731441" lastFinishedPulling="2025-12-01 14:01:58.804968736 +0000 UTC m=+232.789182631" observedRunningTime="2025-12-01 14:02:00.385572829 +0000 UTC m=+234.369786694" watchObservedRunningTime="2025-12-01 14:02:00.386014103 +0000 UTC m=+234.370227948" Dec 01 14:02:01 crc kubenswrapper[4585]: I1201 14:02:01.318372 4585 generic.go:334] "Generic (PLEG): container finished" podID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" containerID="16cd0e6102abbffedb681bf756d6e41f46664d88d70b5deec895ca09f217be61" exitCode=0 Dec 01 14:02:01 crc kubenswrapper[4585]: I1201 14:02:01.318837 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4g6p" event={"ID":"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95","Type":"ContainerDied","Data":"16cd0e6102abbffedb681bf756d6e41f46664d88d70b5deec895ca09f217be61"} Dec 01 14:02:01 crc kubenswrapper[4585]: I1201 14:02:01.324209 4585 generic.go:334] "Generic (PLEG): container finished" podID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" containerID="c0edc4779e0f1f58941c7f63066e7ff2ad039460585f94200895e58302c41f6c" exitCode=0 Dec 01 14:02:01 crc kubenswrapper[4585]: I1201 14:02:01.324411 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh772" event={"ID":"96c46a5f-2a97-42e9-bf04-4a4b83caf45f","Type":"ContainerDied","Data":"c0edc4779e0f1f58941c7f63066e7ff2ad039460585f94200895e58302c41f6c"} Dec 01 14:02:01 crc kubenswrapper[4585]: I1201 14:02:01.328171 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbnd6" 
event={"ID":"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91","Type":"ContainerStarted","Data":"34943a310d82b3ede86ddc5c7f53cb9f2334ca164cc1c199a110fff8577a6e1c"} Dec 01 14:02:01 crc kubenswrapper[4585]: I1201 14:02:01.333875 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzv99" event={"ID":"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3","Type":"ContainerStarted","Data":"11f057ae178774a983f91822d56a3aeebfddcca8a49e3b29136664115950cae2"} Dec 01 14:02:01 crc kubenswrapper[4585]: I1201 14:02:01.371077 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sbnd6" podStartSLOduration=4.909553619 podStartE2EDuration="1m24.371038948s" podCreationTimestamp="2025-12-01 14:00:37 +0000 UTC" firstStartedPulling="2025-12-01 14:00:41.037849228 +0000 UTC m=+155.022063083" lastFinishedPulling="2025-12-01 14:02:00.499334557 +0000 UTC m=+234.483548412" observedRunningTime="2025-12-01 14:02:01.367254282 +0000 UTC m=+235.351468137" watchObservedRunningTime="2025-12-01 14:02:01.371038948 +0000 UTC m=+235.355252803" Dec 01 14:02:02 crc kubenswrapper[4585]: I1201 14:02:02.383169 4585 generic.go:334] "Generic (PLEG): container finished" podID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" containerID="11f057ae178774a983f91822d56a3aeebfddcca8a49e3b29136664115950cae2" exitCode=0 Dec 01 14:02:02 crc kubenswrapper[4585]: I1201 14:02:02.383274 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzv99" event={"ID":"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3","Type":"ContainerDied","Data":"11f057ae178774a983f91822d56a3aeebfddcca8a49e3b29136664115950cae2"} Dec 01 14:02:02 crc kubenswrapper[4585]: I1201 14:02:02.387271 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4g6p" event={"ID":"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95","Type":"ContainerStarted","Data":"424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6"} Dec 01 14:02:02 crc kubenswrapper[4585]: I1201 14:02:02.392037 4585 generic.go:334] "Generic (PLEG): container finished" podID="1162041e-acb6-4788-aaf3-841a80c7ec48" containerID="bfa20b646fc6d8e8bfa11e5686aa09c0ae3d372cd15389c1273737b8fdd6cac4" exitCode=0 Dec 01 14:02:02 crc kubenswrapper[4585]: I1201 14:02:02.392069 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnxb" event={"ID":"1162041e-acb6-4788-aaf3-841a80c7ec48","Type":"ContainerDied","Data":"bfa20b646fc6d8e8bfa11e5686aa09c0ae3d372cd15389c1273737b8fdd6cac4"} Dec 01 14:02:02 crc kubenswrapper[4585]: I1201 14:02:02.469612 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s4g6p" podStartSLOduration=4.728647475 podStartE2EDuration="1m25.469587014s" podCreationTimestamp="2025-12-01 14:00:37 +0000 UTC" firstStartedPulling="2025-12-01 14:00:41.048653226 +0000 UTC m=+155.032867071" lastFinishedPulling="2025-12-01 14:02:01.789592755 +0000 UTC m=+235.773806610" observedRunningTime="2025-12-01 14:02:02.448319944 +0000 UTC m=+236.432533819" watchObservedRunningTime="2025-12-01 14:02:02.469587014 +0000 UTC m=+236.453800859" Dec 01 14:02:03 crc kubenswrapper[4585]: I1201 14:02:03.399926 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh772" event={"ID":"96c46a5f-2a97-42e9-bf04-4a4b83caf45f","Type":"ContainerStarted","Data":"59fbb58dc1dd14c82672f85a806470dca7d93b0bb7dcee644623eeba82b20fff"} Dec 01 14:02:03 
crc kubenswrapper[4585]: I1201 14:02:03.403179 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnxb" event={"ID":"1162041e-acb6-4788-aaf3-841a80c7ec48","Type":"ContainerStarted","Data":"b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc"} Dec 01 14:02:03 crc kubenswrapper[4585]: I1201 14:02:03.406510 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzv99" event={"ID":"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3","Type":"ContainerStarted","Data":"7fa5347d2b5ddb33c479e08c0e229c32f42f7d7306c4b14160e18a7e9203780e"} Dec 01 14:02:03 crc kubenswrapper[4585]: I1201 14:02:03.513003 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nh772" podStartSLOduration=5.581331973 podStartE2EDuration="1m29.512954673s" podCreationTimestamp="2025-12-01 14:00:34 +0000 UTC" firstStartedPulling="2025-12-01 14:00:38.837029797 +0000 UTC m=+152.821243652" lastFinishedPulling="2025-12-01 14:02:02.768652497 +0000 UTC m=+236.752866352" observedRunningTime="2025-12-01 14:02:03.510290401 +0000 UTC m=+237.494504266" watchObservedRunningTime="2025-12-01 14:02:03.512954673 +0000 UTC m=+237.497168528" Dec 01 14:02:03 crc kubenswrapper[4585]: I1201 14:02:03.580768 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qzv99" podStartSLOduration=5.514993337 podStartE2EDuration="1m29.580747485s" podCreationTimestamp="2025-12-01 14:00:34 +0000 UTC" firstStartedPulling="2025-12-01 14:00:38.808185808 +0000 UTC m=+152.792399663" lastFinishedPulling="2025-12-01 14:02:02.873939956 +0000 UTC m=+236.858153811" observedRunningTime="2025-12-01 14:02:03.553167592 +0000 UTC m=+237.537381477" watchObservedRunningTime="2025-12-01 14:02:03.580747485 +0000 UTC m=+237.564961340" Dec 01 14:02:03 crc kubenswrapper[4585]: I1201 14:02:03.582721 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gjnxb" podStartSLOduration=4.606894056 podStartE2EDuration="1m26.582666864s" podCreationTimestamp="2025-12-01 14:00:37 +0000 UTC" firstStartedPulling="2025-12-01 14:00:41.019962492 +0000 UTC m=+155.004176347" lastFinishedPulling="2025-12-01 14:02:02.9957353 +0000 UTC m=+236.979949155" observedRunningTime="2025-12-01 14:02:03.580268141 +0000 UTC m=+237.564481996" watchObservedRunningTime="2025-12-01 14:02:03.582666864 +0000 UTC m=+237.566880719" Dec 01 14:02:04 crc kubenswrapper[4585]: I1201 14:02:04.826455 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:02:05 crc kubenswrapper[4585]: I1201 14:02:05.281271 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nh772" Dec 01 14:02:05 crc kubenswrapper[4585]: I1201 14:02:05.281340 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nh772" Dec 01 14:02:05 crc kubenswrapper[4585]: I1201 14:02:05.595421 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:02:05 crc kubenswrapper[4585]: I1201 14:02:05.595497 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:02:06 crc kubenswrapper[4585]: I1201 14:02:06.324561 4585 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-nh772" podUID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" containerName="registry-server" probeResult="failure" output=< Dec 01 14:02:06 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:02:06 crc kubenswrapper[4585]: > Dec 01 14:02:06 crc kubenswrapper[4585]: I1201 14:02:06.657379 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qzv99" podUID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" containerName="registry-server" probeResult="failure" output=< Dec 01 14:02:06 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:02:06 crc kubenswrapper[4585]: > Dec 01 14:02:07 crc kubenswrapper[4585]: I1201 14:02:07.573705 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:02:07 crc kubenswrapper[4585]: I1201 14:02:07.574251 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:02:07 crc kubenswrapper[4585]: I1201 14:02:07.623564 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:02:07 crc kubenswrapper[4585]: I1201 14:02:07.940420 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:02:07 crc kubenswrapper[4585]: I1201 14:02:07.940572 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:02:07 crc kubenswrapper[4585]: I1201 14:02:07.991677 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:02:07 crc kubenswrapper[4585]: I1201 14:02:07.998381 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:02:07 crc kubenswrapper[4585]: I1201 14:02:07.999060 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:02:08 crc kubenswrapper[4585]: I1201 14:02:08.052683 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:02:08 crc kubenswrapper[4585]: I1201 14:02:08.320011 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:02:08 crc kubenswrapper[4585]: I1201 14:02:08.320079 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:02:08 crc kubenswrapper[4585]: I1201 14:02:08.485687 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:02:08 crc kubenswrapper[4585]: I1201 14:02:08.486250 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:02:08 crc kubenswrapper[4585]: I1201 14:02:08.490746 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:02:08 crc kubenswrapper[4585]: I1201 14:02:08.956663 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-c7gls"] Dec 01 14:02:09 crc kubenswrapper[4585]: I1201 14:02:09.372878 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjnxb" podUID="1162041e-acb6-4788-aaf3-841a80c7ec48" containerName="registry-server" probeResult="failure" output=< Dec 01 14:02:09 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:02:09 crc kubenswrapper[4585]: > Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.139489 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8pvrj"] Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.140555 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8pvrj" podUID="dd194ad4-dd93-47aa-8c18-afc2426825ac" containerName="registry-server" containerID="cri-o://4f4e4af5ef279dde1bdad376628171947106224bcb50a5cb0bb243de18e71d34" gracePeriod=30 Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.158882 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzv99"] Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.159162 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qzv99" podUID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" containerName="registry-server" containerID="cri-o://7fa5347d2b5ddb33c479e08c0e229c32f42f7d7306c4b14160e18a7e9203780e" gracePeriod=30 Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.170950 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ng9zf"] Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.171218 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ng9zf" podUID="611e970e-43b2-43b8-b2d6-6302693b7c88" containerName="registry-server" containerID="cri-o://7041f1f0c8846077d11065328a96929403ac3a75facb56cff62e8907b7b575c6" gracePeriod=30 Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.180671 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nh772"] Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.181114 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nh772" podUID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" containerName="registry-server" containerID="cri-o://59fbb58dc1dd14c82672f85a806470dca7d93b0bb7dcee644623eeba82b20fff" gracePeriod=30 Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.196407 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hdrjx"] Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.196740 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" podUID="b615253a-f52e-4607-a63c-7cf1c07dab6b" containerName="marketplace-operator" containerID="cri-o://8657a9e1dd258ca6c13c7fc6d2ca6b96548f6fdfd234e7153f50143454c084d4" gracePeriod=30 Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.214431 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rl7qq"] Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.214658 4585 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-rl7qq" podUID="12d64d5a-7b7e-49c9-985a-14efebb14506" containerName="registry-server" containerID="cri-o://3936e4f44cdc152844f8fba7c32b857b09f198c698e40bc39f378235451d2a64" gracePeriod=30 Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.229218 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4g6p"] Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.238406 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjnxb"] Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.238671 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gjnxb" podUID="1162041e-acb6-4788-aaf3-841a80c7ec48" containerName="registry-server" containerID="cri-o://b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc" gracePeriod=30 Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.266572 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lwz9j"] Dec 01 14:02:11 crc kubenswrapper[4585]: E1201 14:02:11.267166 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89b57e3-25af-4211-9f70-7f24ce074512" containerName="pruner" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.267188 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89b57e3-25af-4211-9f70-7f24ce074512" containerName="pruner" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.267391 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f89b57e3-25af-4211-9f70-7f24ce074512" containerName="pruner" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.275379 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.285195 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sbnd6"] Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.289952 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lwz9j"] Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.437572 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73887a19-b0ad-43de-a7d3-bda4a7a2a06a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lwz9j\" (UID: \"73887a19-b0ad-43de-a7d3-bda4a7a2a06a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.437677 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddl9d\" (UniqueName: \"kubernetes.io/projected/73887a19-b0ad-43de-a7d3-bda4a7a2a06a-kube-api-access-ddl9d\") pod \"marketplace-operator-79b997595-lwz9j\" (UID: \"73887a19-b0ad-43de-a7d3-bda4a7a2a06a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.437702 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73887a19-b0ad-43de-a7d3-bda4a7a2a06a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lwz9j\" (UID: \"73887a19-b0ad-43de-a7d3-bda4a7a2a06a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.454892 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sbnd6" podUID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" containerName="registry-server" containerID="cri-o://34943a310d82b3ede86ddc5c7f53cb9f2334ca164cc1c199a110fff8577a6e1c" gracePeriod=30 Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.455118 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s4g6p" podUID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" containerName="registry-server" containerID="cri-o://424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6" gracePeriod=30 Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.539263 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73887a19-b0ad-43de-a7d3-bda4a7a2a06a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lwz9j\" (UID: \"73887a19-b0ad-43de-a7d3-bda4a7a2a06a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.539380 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73887a19-b0ad-43de-a7d3-bda4a7a2a06a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lwz9j\" (UID: \"73887a19-b0ad-43de-a7d3-bda4a7a2a06a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.539444 4585 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ddl9d\" (UniqueName: \"kubernetes.io/projected/73887a19-b0ad-43de-a7d3-bda4a7a2a06a-kube-api-access-ddl9d\") pod \"marketplace-operator-79b997595-lwz9j\" (UID: \"73887a19-b0ad-43de-a7d3-bda4a7a2a06a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.540683 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73887a19-b0ad-43de-a7d3-bda4a7a2a06a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lwz9j\" (UID: \"73887a19-b0ad-43de-a7d3-bda4a7a2a06a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.550906 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/73887a19-b0ad-43de-a7d3-bda4a7a2a06a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lwz9j\" (UID: \"73887a19-b0ad-43de-a7d3-bda4a7a2a06a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.578503 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddl9d\" (UniqueName: \"kubernetes.io/projected/73887a19-b0ad-43de-a7d3-bda4a7a2a06a-kube-api-access-ddl9d\") pod \"marketplace-operator-79b997595-lwz9j\" (UID: \"73887a19-b0ad-43de-a7d3-bda4a7a2a06a\") " pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.653793 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.755541 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4g6p"] Dec 01 14:02:11 crc kubenswrapper[4585]: I1201 14:02:11.959619 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lwz9j"] Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.453994 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.462611 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.491502 4585 generic.go:334] "Generic (PLEG): container finished" podID="1162041e-acb6-4788-aaf3-841a80c7ec48" containerID="b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc" exitCode=0 Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.491646 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnxb" event={"ID":"1162041e-acb6-4788-aaf3-841a80c7ec48","Type":"ContainerDied","Data":"b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.492068 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnxb" event={"ID":"1162041e-acb6-4788-aaf3-841a80c7ec48","Type":"ContainerDied","Data":"c94ef80bb8391f109e3915b0fee608ea4dfbee36bbe16ecc6b402eb85e9761c6"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.492096 4585 scope.go:117] "RemoveContainer" containerID="b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.491741 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjnxb" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.496141 4585 generic.go:334] "Generic (PLEG): container finished" podID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" containerID="34943a310d82b3ede86ddc5c7f53cb9f2334ca164cc1c199a110fff8577a6e1c" exitCode=0 Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.496254 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbnd6" event={"ID":"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91","Type":"ContainerDied","Data":"34943a310d82b3ede86ddc5c7f53cb9f2334ca164cc1c199a110fff8577a6e1c"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.496280 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbnd6" event={"ID":"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91","Type":"ContainerDied","Data":"16b6bc1e6d590246c20ad047ad80e802c5d041c9192f7ff13246213eef9d78ef"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.496311 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16b6bc1e6d590246c20ad047ad80e802c5d041c9192f7ff13246213eef9d78ef" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.499236 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.501334 4585 generic.go:334] "Generic (PLEG): container finished" podID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" containerID="7fa5347d2b5ddb33c479e08c0e229c32f42f7d7306c4b14160e18a7e9203780e" exitCode=0 Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.501450 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzv99" event={"ID":"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3","Type":"ContainerDied","Data":"7fa5347d2b5ddb33c479e08c0e229c32f42f7d7306c4b14160e18a7e9203780e"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.505316 4585 generic.go:334] "Generic (PLEG): container finished" podID="dd194ad4-dd93-47aa-8c18-afc2426825ac" containerID="4f4e4af5ef279dde1bdad376628171947106224bcb50a5cb0bb243de18e71d34" exitCode=0 Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.505367 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pvrj" event={"ID":"dd194ad4-dd93-47aa-8c18-afc2426825ac","Type":"ContainerDied","Data":"4f4e4af5ef279dde1bdad376628171947106224bcb50a5cb0bb243de18e71d34"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.528142 4585 generic.go:334] "Generic (PLEG): container finished" podID="611e970e-43b2-43b8-b2d6-6302693b7c88" containerID="7041f1f0c8846077d11065328a96929403ac3a75facb56cff62e8907b7b575c6" exitCode=0 Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.528317 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng9zf" event={"ID":"611e970e-43b2-43b8-b2d6-6302693b7c88","Type":"ContainerDied","Data":"7041f1f0c8846077d11065328a96929403ac3a75facb56cff62e8907b7b575c6"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.536707 4585 generic.go:334] "Generic (PLEG): container finished" podID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" containerID="424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6" exitCode=0 Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.536812 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4g6p" event={"ID":"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95","Type":"ContainerDied","Data":"424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.536849 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s4g6p" event={"ID":"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95","Type":"ContainerDied","Data":"81dd475d0c14c39c18cdf168b644a441438be656ce59b467e4c8809a5c707774"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.536947 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s4g6p" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.553758 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" event={"ID":"b615253a-f52e-4607-a63c-7cf1c07dab6b","Type":"ContainerDied","Data":"8657a9e1dd258ca6c13c7fc6d2ca6b96548f6fdfd234e7153f50143454c084d4"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.553845 4585 generic.go:334] "Generic (PLEG): container finished" podID="b615253a-f52e-4607-a63c-7cf1c07dab6b" containerID="8657a9e1dd258ca6c13c7fc6d2ca6b96548f6fdfd234e7153f50143454c084d4" exitCode=0 Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.556375 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-catalog-content\") pod \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\" (UID: \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\") " Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.556432 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trbch\" (UniqueName: \"kubernetes.io/projected/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-kube-api-access-trbch\") pod \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\" (UID: \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\") " Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.556501 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-utilities\") pod \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\" (UID: \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\") " Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.556548 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhk9g\" (UniqueName: \"kubernetes.io/projected/1162041e-acb6-4788-aaf3-841a80c7ec48-kube-api-access-rhk9g\") pod \"1162041e-acb6-4788-aaf3-841a80c7ec48\" (UID: \"1162041e-acb6-4788-aaf3-841a80c7ec48\") " Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.556575 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-catalog-content\") pod \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\" (UID: \"7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95\") " Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.556647 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1162041e-acb6-4788-aaf3-841a80c7ec48-catalog-content\") pod \"1162041e-acb6-4788-aaf3-841a80c7ec48\" (UID: \"1162041e-acb6-4788-aaf3-841a80c7ec48\") " Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.556703 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-utilities\") pod \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\" (UID: \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\") " Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.556773 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1162041e-acb6-4788-aaf3-841a80c7ec48-utilities\") pod \"1162041e-acb6-4788-aaf3-841a80c7ec48\" (UID: \"1162041e-acb6-4788-aaf3-841a80c7ec48\") " Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 
14:02:12.556804 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzp4k\" (UniqueName: \"kubernetes.io/projected/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-kube-api-access-pzp4k\") pod \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\" (UID: \"51b10af7-6b4e-49b2-81cd-50b2a5d5fd91\") " Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.561835 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-utilities" (OuterVolumeSpecName: "utilities") pod "51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" (UID: "51b10af7-6b4e-49b2-81cd-50b2a5d5fd91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.562820 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1162041e-acb6-4788-aaf3-841a80c7ec48-utilities" (OuterVolumeSpecName: "utilities") pod "1162041e-acb6-4788-aaf3-841a80c7ec48" (UID: "1162041e-acb6-4788-aaf3-841a80c7ec48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.567668 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-kube-api-access-trbch" (OuterVolumeSpecName: "kube-api-access-trbch") pod "7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" (UID: "7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95"). InnerVolumeSpecName "kube-api-access-trbch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.595650 4585 generic.go:334] "Generic (PLEG): container finished" podID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" containerID="59fbb58dc1dd14c82672f85a806470dca7d93b0bb7dcee644623eeba82b20fff" exitCode=0 Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.597028 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh772" event={"ID":"96c46a5f-2a97-42e9-bf04-4a4b83caf45f","Type":"ContainerDied","Data":"59fbb58dc1dd14c82672f85a806470dca7d93b0bb7dcee644623eeba82b20fff"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.599156 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1162041e-acb6-4788-aaf3-841a80c7ec48-kube-api-access-rhk9g" (OuterVolumeSpecName: "kube-api-access-rhk9g") pod "1162041e-acb6-4788-aaf3-841a80c7ec48" (UID: "1162041e-acb6-4788-aaf3-841a80c7ec48"). InnerVolumeSpecName "kube-api-access-rhk9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.601230 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-kube-api-access-pzp4k" (OuterVolumeSpecName: "kube-api-access-pzp4k") pod "51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" (UID: "51b10af7-6b4e-49b2-81cd-50b2a5d5fd91"). InnerVolumeSpecName "kube-api-access-pzp4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.604131 4585 scope.go:117] "RemoveContainer" containerID="bfa20b646fc6d8e8bfa11e5686aa09c0ae3d372cd15389c1273737b8fdd6cac4" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.609928 4585 generic.go:334] "Generic (PLEG): container finished" podID="12d64d5a-7b7e-49c9-985a-14efebb14506" containerID="3936e4f44cdc152844f8fba7c32b857b09f198c698e40bc39f378235451d2a64" exitCode=0 Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.610086 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl7qq" event={"ID":"12d64d5a-7b7e-49c9-985a-14efebb14506","Type":"ContainerDied","Data":"3936e4f44cdc152844f8fba7c32b857b09f198c698e40bc39f378235451d2a64"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.613469 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-utilities" (OuterVolumeSpecName: "utilities") pod "7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" (UID: "7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.614320 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" event={"ID":"73887a19-b0ad-43de-a7d3-bda4a7a2a06a","Type":"ContainerStarted","Data":"051ec2a2bbf5a9dc9d07c8762d7d460c4a399017943648d0a1e9255684d9562e"} Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.653125 4585 scope.go:117] "RemoveContainer" containerID="8ad45edb727d3dd1c1754597f1f88d3f21815af349298555d3fdedb2f8194f88" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.657879 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.657907 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1162041e-acb6-4788-aaf3-841a80c7ec48-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.657919 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzp4k\" (UniqueName: \"kubernetes.io/projected/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-kube-api-access-pzp4k\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.657930 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trbch\" (UniqueName: \"kubernetes.io/projected/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-kube-api-access-trbch\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.657939 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.657949 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhk9g\" (UniqueName: \"kubernetes.io/projected/1162041e-acb6-4788-aaf3-841a80c7ec48-kube-api-access-rhk9g\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.686833 4585 scope.go:117] "RemoveContainer" containerID="b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc" Dec 
01 14:02:12 crc kubenswrapper[4585]: E1201 14:02:12.687875 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc\": container with ID starting with b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc not found: ID does not exist" containerID="b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.688030 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc"} err="failed to get container status \"b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc\": rpc error: code = NotFound desc = could not find container \"b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc\": container with ID starting with b190a93b2678bc9a87fc2232b868e2b3f249e065563b465c6620aaf431f850cc not found: ID does not exist" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.688057 4585 scope.go:117] "RemoveContainer" containerID="bfa20b646fc6d8e8bfa11e5686aa09c0ae3d372cd15389c1273737b8fdd6cac4" Dec 01 14:02:12 crc kubenswrapper[4585]: E1201 14:02:12.689188 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfa20b646fc6d8e8bfa11e5686aa09c0ae3d372cd15389c1273737b8fdd6cac4\": container with ID starting with bfa20b646fc6d8e8bfa11e5686aa09c0ae3d372cd15389c1273737b8fdd6cac4 not found: ID does not exist" containerID="bfa20b646fc6d8e8bfa11e5686aa09c0ae3d372cd15389c1273737b8fdd6cac4" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.689284 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa20b646fc6d8e8bfa11e5686aa09c0ae3d372cd15389c1273737b8fdd6cac4"} err="failed to get container status \"bfa20b646fc6d8e8bfa11e5686aa09c0ae3d372cd15389c1273737b8fdd6cac4\": rpc error: code = NotFound desc = could not find container \"bfa20b646fc6d8e8bfa11e5686aa09c0ae3d372cd15389c1273737b8fdd6cac4\": container with ID starting with bfa20b646fc6d8e8bfa11e5686aa09c0ae3d372cd15389c1273737b8fdd6cac4 not found: ID does not exist" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.689317 4585 scope.go:117] "RemoveContainer" containerID="8ad45edb727d3dd1c1754597f1f88d3f21815af349298555d3fdedb2f8194f88" Dec 01 14:02:12 crc kubenswrapper[4585]: E1201 14:02:12.689655 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ad45edb727d3dd1c1754597f1f88d3f21815af349298555d3fdedb2f8194f88\": container with ID starting with 8ad45edb727d3dd1c1754597f1f88d3f21815af349298555d3fdedb2f8194f88 not found: ID does not exist" containerID="8ad45edb727d3dd1c1754597f1f88d3f21815af349298555d3fdedb2f8194f88" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.689696 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad45edb727d3dd1c1754597f1f88d3f21815af349298555d3fdedb2f8194f88"} err="failed to get container status \"8ad45edb727d3dd1c1754597f1f88d3f21815af349298555d3fdedb2f8194f88\": rpc error: code = NotFound desc = could not find container \"8ad45edb727d3dd1c1754597f1f88d3f21815af349298555d3fdedb2f8194f88\": container with ID starting with 8ad45edb727d3dd1c1754597f1f88d3f21815af349298555d3fdedb2f8194f88 not found: ID does not exist" Dec 01 14:02:12 crc 
kubenswrapper[4585]: I1201 14:02:12.689777 4585 scope.go:117] "RemoveContainer" containerID="424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.701765 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" (UID: "7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.715733 4585 scope.go:117] "RemoveContainer" containerID="16cd0e6102abbffedb681bf756d6e41f46664d88d70b5deec895ca09f217be61" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.723541 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" (UID: "51b10af7-6b4e-49b2-81cd-50b2a5d5fd91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.744647 4585 scope.go:117] "RemoveContainer" containerID="90f45ad98aad2380aa74c734506295f309d0279ff869879f8e35293acb499e10" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.760478 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.760514 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.768189 4585 scope.go:117] "RemoveContainer" containerID="424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6" Dec 01 14:02:12 crc kubenswrapper[4585]: E1201 14:02:12.768681 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6\": container with ID starting with 424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6 not found: ID does not exist" containerID="424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.768718 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6"} err="failed to get container status \"424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6\": rpc error: code = NotFound desc = could not find container \"424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6\": container with ID starting with 424a27fae70b183518da9eab83bb57bf6f46aa7d2ba8bf33cca5b6f7611efbf6 not found: ID does not exist" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.768749 4585 scope.go:117] "RemoveContainer" containerID="16cd0e6102abbffedb681bf756d6e41f46664d88d70b5deec895ca09f217be61" Dec 01 14:02:12 crc kubenswrapper[4585]: E1201 14:02:12.769149 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"16cd0e6102abbffedb681bf756d6e41f46664d88d70b5deec895ca09f217be61\": container with ID starting with 16cd0e6102abbffedb681bf756d6e41f46664d88d70b5deec895ca09f217be61 not found: ID does not exist" containerID="16cd0e6102abbffedb681bf756d6e41f46664d88d70b5deec895ca09f217be61" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.769167 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16cd0e6102abbffedb681bf756d6e41f46664d88d70b5deec895ca09f217be61"} err="failed to get container status \"16cd0e6102abbffedb681bf756d6e41f46664d88d70b5deec895ca09f217be61\": rpc error: code = NotFound desc = could not find container \"16cd0e6102abbffedb681bf756d6e41f46664d88d70b5deec895ca09f217be61\": container with ID starting with 16cd0e6102abbffedb681bf756d6e41f46664d88d70b5deec895ca09f217be61 not found: ID does not exist" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.769183 4585 scope.go:117] "RemoveContainer" containerID="90f45ad98aad2380aa74c734506295f309d0279ff869879f8e35293acb499e10" Dec 01 14:02:12 crc kubenswrapper[4585]: E1201 14:02:12.769664 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f45ad98aad2380aa74c734506295f309d0279ff869879f8e35293acb499e10\": container with ID starting with 90f45ad98aad2380aa74c734506295f309d0279ff869879f8e35293acb499e10 not found: ID does not exist" containerID="90f45ad98aad2380aa74c734506295f309d0279ff869879f8e35293acb499e10" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.769717 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f45ad98aad2380aa74c734506295f309d0279ff869879f8e35293acb499e10"} err="failed to get container status \"90f45ad98aad2380aa74c734506295f309d0279ff869879f8e35293acb499e10\": rpc error: code = NotFound desc = could not find container \"90f45ad98aad2380aa74c734506295f309d0279ff869879f8e35293acb499e10\": container with ID starting with 90f45ad98aad2380aa74c734506295f309d0279ff869879f8e35293acb499e10 not found: ID does not exist" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.772934 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1162041e-acb6-4788-aaf3-841a80c7ec48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1162041e-acb6-4788-aaf3-841a80c7ec48" (UID: "1162041e-acb6-4788-aaf3-841a80c7ec48"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.861901 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1162041e-acb6-4788-aaf3-841a80c7ec48-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.871068 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjnxb"] Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.897042 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gjnxb"] Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.906979 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4g6p"] Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.919926 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.925310 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s4g6p"] Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.963183 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7cr6\" (UniqueName: \"kubernetes.io/projected/dd194ad4-dd93-47aa-8c18-afc2426825ac-kube-api-access-t7cr6\") pod \"dd194ad4-dd93-47aa-8c18-afc2426825ac\" (UID: \"dd194ad4-dd93-47aa-8c18-afc2426825ac\") " Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.963706 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd194ad4-dd93-47aa-8c18-afc2426825ac-utilities\") pod \"dd194ad4-dd93-47aa-8c18-afc2426825ac\" (UID: \"dd194ad4-dd93-47aa-8c18-afc2426825ac\") " Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.963901 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd194ad4-dd93-47aa-8c18-afc2426825ac-catalog-content\") pod \"dd194ad4-dd93-47aa-8c18-afc2426825ac\" (UID: \"dd194ad4-dd93-47aa-8c18-afc2426825ac\") " Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.966392 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd194ad4-dd93-47aa-8c18-afc2426825ac-utilities" (OuterVolumeSpecName: "utilities") pod "dd194ad4-dd93-47aa-8c18-afc2426825ac" (UID: "dd194ad4-dd93-47aa-8c18-afc2426825ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.967415 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.970768 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd194ad4-dd93-47aa-8c18-afc2426825ac-kube-api-access-t7cr6" (OuterVolumeSpecName: "kube-api-access-t7cr6") pod "dd194ad4-dd93-47aa-8c18-afc2426825ac" (UID: "dd194ad4-dd93-47aa-8c18-afc2426825ac"). InnerVolumeSpecName "kube-api-access-t7cr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:02:12 crc kubenswrapper[4585]: I1201 14:02:12.980457 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.049821 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd194ad4-dd93-47aa-8c18-afc2426825ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd194ad4-dd93-47aa-8c18-afc2426825ac" (UID: "dd194ad4-dd93-47aa-8c18-afc2426825ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.069162 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611e970e-43b2-43b8-b2d6-6302693b7c88-catalog-content\") pod \"611e970e-43b2-43b8-b2d6-6302693b7c88\" (UID: \"611e970e-43b2-43b8-b2d6-6302693b7c88\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.069243 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5fhb\" (UniqueName: \"kubernetes.io/projected/611e970e-43b2-43b8-b2d6-6302693b7c88-kube-api-access-v5fhb\") pod \"611e970e-43b2-43b8-b2d6-6302693b7c88\" (UID: \"611e970e-43b2-43b8-b2d6-6302693b7c88\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.069270 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b615253a-f52e-4607-a63c-7cf1c07dab6b-marketplace-operator-metrics\") pod \"b615253a-f52e-4607-a63c-7cf1c07dab6b\" (UID: \"b615253a-f52e-4607-a63c-7cf1c07dab6b\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.069289 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b615253a-f52e-4607-a63c-7cf1c07dab6b-marketplace-trusted-ca\") pod \"b615253a-f52e-4607-a63c-7cf1c07dab6b\" (UID: \"b615253a-f52e-4607-a63c-7cf1c07dab6b\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.069332 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57v2v\" (UniqueName: \"kubernetes.io/projected/b615253a-f52e-4607-a63c-7cf1c07dab6b-kube-api-access-57v2v\") pod \"b615253a-f52e-4607-a63c-7cf1c07dab6b\" (UID: \"b615253a-f52e-4607-a63c-7cf1c07dab6b\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.069377 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611e970e-43b2-43b8-b2d6-6302693b7c88-utilities\") pod \"611e970e-43b2-43b8-b2d6-6302693b7c88\" (UID: \"611e970e-43b2-43b8-b2d6-6302693b7c88\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.069587 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7cr6\" (UniqueName: \"kubernetes.io/projected/dd194ad4-dd93-47aa-8c18-afc2426825ac-kube-api-access-t7cr6\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.069602 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd194ad4-dd93-47aa-8c18-afc2426825ac-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.069612 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd194ad4-dd93-47aa-8c18-afc2426825ac-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.071454 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b615253a-f52e-4607-a63c-7cf1c07dab6b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b615253a-f52e-4607-a63c-7cf1c07dab6b" (UID: "b615253a-f52e-4607-a63c-7cf1c07dab6b"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.072052 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/611e970e-43b2-43b8-b2d6-6302693b7c88-utilities" (OuterVolumeSpecName: "utilities") pod "611e970e-43b2-43b8-b2d6-6302693b7c88" (UID: "611e970e-43b2-43b8-b2d6-6302693b7c88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.077690 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/611e970e-43b2-43b8-b2d6-6302693b7c88-kube-api-access-v5fhb" (OuterVolumeSpecName: "kube-api-access-v5fhb") pod "611e970e-43b2-43b8-b2d6-6302693b7c88" (UID: "611e970e-43b2-43b8-b2d6-6302693b7c88"). InnerVolumeSpecName "kube-api-access-v5fhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.077831 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b615253a-f52e-4607-a63c-7cf1c07dab6b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b615253a-f52e-4607-a63c-7cf1c07dab6b" (UID: "b615253a-f52e-4607-a63c-7cf1c07dab6b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.081394 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b615253a-f52e-4607-a63c-7cf1c07dab6b-kube-api-access-57v2v" (OuterVolumeSpecName: "kube-api-access-57v2v") pod "b615253a-f52e-4607-a63c-7cf1c07dab6b" (UID: "b615253a-f52e-4607-a63c-7cf1c07dab6b"). InnerVolumeSpecName "kube-api-access-57v2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.085230 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.095017 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nh772" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.138669 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/611e970e-43b2-43b8-b2d6-6302693b7c88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "611e970e-43b2-43b8-b2d6-6302693b7c88" (UID: "611e970e-43b2-43b8-b2d6-6302693b7c88"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.171997 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-catalog-content\") pod \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\" (UID: \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.172140 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-utilities\") pod \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\" (UID: \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.172186 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-utilities\") pod \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\" (UID: \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.172235 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wsmr\" (UniqueName: \"kubernetes.io/projected/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-kube-api-access-9wsmr\") pod \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\" (UID: \"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.172290 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfzqw\" (UniqueName: \"kubernetes.io/projected/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-kube-api-access-dfzqw\") pod \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\" (UID: \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.172319 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-catalog-content\") pod \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\" (UID: \"96c46a5f-2a97-42e9-bf04-4a4b83caf45f\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.172554 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/611e970e-43b2-43b8-b2d6-6302693b7c88-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.172577 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/611e970e-43b2-43b8-b2d6-6302693b7c88-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.172591 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5fhb\" (UniqueName: \"kubernetes.io/projected/611e970e-43b2-43b8-b2d6-6302693b7c88-kube-api-access-v5fhb\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.172605 4585 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b615253a-f52e-4607-a63c-7cf1c07dab6b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.172620 4585 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b615253a-f52e-4607-a63c-7cf1c07dab6b-marketplace-trusted-ca\") on node 
\"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.172632 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57v2v\" (UniqueName: \"kubernetes.io/projected/b615253a-f52e-4607-a63c-7cf1c07dab6b-kube-api-access-57v2v\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.173682 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-utilities" (OuterVolumeSpecName: "utilities") pod "96c46a5f-2a97-42e9-bf04-4a4b83caf45f" (UID: "96c46a5f-2a97-42e9-bf04-4a4b83caf45f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.173703 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-utilities" (OuterVolumeSpecName: "utilities") pod "2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" (UID: "2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.185047 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-kube-api-access-dfzqw" (OuterVolumeSpecName: "kube-api-access-dfzqw") pod "96c46a5f-2a97-42e9-bf04-4a4b83caf45f" (UID: "96c46a5f-2a97-42e9-bf04-4a4b83caf45f"). InnerVolumeSpecName "kube-api-access-dfzqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.199915 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-kube-api-access-9wsmr" (OuterVolumeSpecName: "kube-api-access-9wsmr") pod "2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" (UID: "2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3"). InnerVolumeSpecName "kube-api-access-9wsmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.258028 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96c46a5f-2a97-42e9-bf04-4a4b83caf45f" (UID: "96c46a5f-2a97-42e9-bf04-4a4b83caf45f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.274101 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.274564 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.274575 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wsmr\" (UniqueName: \"kubernetes.io/projected/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-kube-api-access-9wsmr\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.274592 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfzqw\" (UniqueName: \"kubernetes.io/projected/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-kube-api-access-dfzqw\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.274602 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96c46a5f-2a97-42e9-bf04-4a4b83caf45f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.274690 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" (UID: "2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.286665 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.378890 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d64d5a-7b7e-49c9-985a-14efebb14506-utilities\") pod \"12d64d5a-7b7e-49c9-985a-14efebb14506\" (UID: \"12d64d5a-7b7e-49c9-985a-14efebb14506\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.379039 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25jx9\" (UniqueName: \"kubernetes.io/projected/12d64d5a-7b7e-49c9-985a-14efebb14506-kube-api-access-25jx9\") pod \"12d64d5a-7b7e-49c9-985a-14efebb14506\" (UID: \"12d64d5a-7b7e-49c9-985a-14efebb14506\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.379065 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d64d5a-7b7e-49c9-985a-14efebb14506-catalog-content\") pod \"12d64d5a-7b7e-49c9-985a-14efebb14506\" (UID: \"12d64d5a-7b7e-49c9-985a-14efebb14506\") " Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.379753 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.380474 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12d64d5a-7b7e-49c9-985a-14efebb14506-utilities" (OuterVolumeSpecName: "utilities") pod "12d64d5a-7b7e-49c9-985a-14efebb14506" (UID: "12d64d5a-7b7e-49c9-985a-14efebb14506"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.384277 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12d64d5a-7b7e-49c9-985a-14efebb14506-kube-api-access-25jx9" (OuterVolumeSpecName: "kube-api-access-25jx9") pod "12d64d5a-7b7e-49c9-985a-14efebb14506" (UID: "12d64d5a-7b7e-49c9-985a-14efebb14506"). InnerVolumeSpecName "kube-api-access-25jx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.396801 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12d64d5a-7b7e-49c9-985a-14efebb14506-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12d64d5a-7b7e-49c9-985a-14efebb14506" (UID: "12d64d5a-7b7e-49c9-985a-14efebb14506"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.481422 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25jx9\" (UniqueName: \"kubernetes.io/projected/12d64d5a-7b7e-49c9-985a-14efebb14506-kube-api-access-25jx9\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.481478 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d64d5a-7b7e-49c9-985a-14efebb14506-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.481518 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d64d5a-7b7e-49c9-985a-14efebb14506-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.623785 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.623692 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hdrjx" event={"ID":"b615253a-f52e-4607-a63c-7cf1c07dab6b","Type":"ContainerDied","Data":"33f4eb70234be22f1a89b6a1b7f15ceb7661bb524881e415d3dd8edbec9a3dd0"} Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.624100 4585 scope.go:117] "RemoveContainer" containerID="8657a9e1dd258ca6c13c7fc6d2ca6b96548f6fdfd234e7153f50143454c084d4" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.627043 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nh772" event={"ID":"96c46a5f-2a97-42e9-bf04-4a4b83caf45f","Type":"ContainerDied","Data":"972f5610170271d990f83d12c032bddc53a87fa929c57566f90950dcf7922b95"} Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.627173 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nh772" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.634555 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rl7qq" event={"ID":"12d64d5a-7b7e-49c9-985a-14efebb14506","Type":"ContainerDied","Data":"5acf4ef146e95a69e2b55fa9d37a8eac8fdb433279d3462de56b02b6b7550e9d"} Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.634603 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rl7qq" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.636223 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" event={"ID":"73887a19-b0ad-43de-a7d3-bda4a7a2a06a","Type":"ContainerStarted","Data":"ab8cc279cba75bc2bda292d3df794823a2e639b0497305a3f74579c7cf5f84c7"} Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.637353 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.640125 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.640452 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8pvrj" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.640461 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8pvrj" event={"ID":"dd194ad4-dd93-47aa-8c18-afc2426825ac","Type":"ContainerDied","Data":"92226c221b99c5fa7f5e92fa0dc6f25ab350dee8f1cacca6867f935ff2768120"} Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.643950 4585 scope.go:117] "RemoveContainer" containerID="59fbb58dc1dd14c82672f85a806470dca7d93b0bb7dcee644623eeba82b20fff" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.648318 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ng9zf" event={"ID":"611e970e-43b2-43b8-b2d6-6302693b7c88","Type":"ContainerDied","Data":"aef7438efd8ace8fa277fbdb03645f2c9a7d40aa19fd30b6a4c34977955d51e7"} Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.648463 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ng9zf" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.678484 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzv99" event={"ID":"2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3","Type":"ContainerDied","Data":"4d3994497a947dddb6ecb9255261b2b0e714bcc07ca9dc90f5fc160119883898"} Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.678515 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzv99" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.678563 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sbnd6" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.691382 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lwz9j" podStartSLOduration=2.691355765 podStartE2EDuration="2.691355765s" podCreationTimestamp="2025-12-01 14:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:02:13.678157121 +0000 UTC m=+247.662370996" watchObservedRunningTime="2025-12-01 14:02:13.691355765 +0000 UTC m=+247.675569620" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.691129 4585 scope.go:117] "RemoveContainer" containerID="c0edc4779e0f1f58941c7f63066e7ff2ad039460585f94200895e58302c41f6c" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.742065 4585 scope.go:117] "RemoveContainer" containerID="8abd7558631eb33790615dcc525ea5f882b0360ec23eefd7bbe98d11669dfb62" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.761396 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nh772"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.772676 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nh772"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.776490 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rl7qq"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.786115 4585 scope.go:117] "RemoveContainer" containerID="3936e4f44cdc152844f8fba7c32b857b09f198c698e40bc39f378235451d2a64" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.786406 4585 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rl7qq"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.793068 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hdrjx"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.805856 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hdrjx"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.812714 4585 scope.go:117] "RemoveContainer" containerID="25249b6ec947d6528b9a42f33b00c7eccf54e78b8dc280cf8fb8a977b3bb4d71" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.821820 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzv99"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.829348 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qzv99"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.834244 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8pvrj"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.838010 4585 scope.go:117] "RemoveContainer" containerID="288de8d26d7811f16a133b6c85dabc0c848450d5e43bdb3d4ac31f5d68d7b62e" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.848182 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8pvrj"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.853318 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sbnd6"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.856592 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sbnd6"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.870154 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ng9zf"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.871522 4585 scope.go:117] "RemoveContainer" containerID="4f4e4af5ef279dde1bdad376628171947106224bcb50a5cb0bb243de18e71d34" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.875224 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ng9zf"] Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.889111 4585 scope.go:117] "RemoveContainer" containerID="4c399aadea5fc5f39423bfdf9310c21cb26ebc1e0d326fe334f70cc64e5a2773" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.909843 4585 scope.go:117] "RemoveContainer" containerID="ae2f108021ebf97e8cd2c5dc8ab1b5cf2d664d2922308c721d789096af5d88d8" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.928027 4585 scope.go:117] "RemoveContainer" containerID="7041f1f0c8846077d11065328a96929403ac3a75facb56cff62e8907b7b575c6" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.943642 4585 scope.go:117] "RemoveContainer" containerID="63d88b49d8954654528f1e4b5b5f533b29b2f08473dc1c3c574a338d044563a7" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.965151 4585 scope.go:117] "RemoveContainer" containerID="9665767658d3db1016c0e68db54b47778b270fb65d0d0a1536a3fe78adff7af2" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.981337 4585 scope.go:117] "RemoveContainer" containerID="7fa5347d2b5ddb33c479e08c0e229c32f42f7d7306c4b14160e18a7e9203780e" Dec 01 14:02:13 crc kubenswrapper[4585]: I1201 14:02:13.997558 4585 scope.go:117] "RemoveContainer" 
containerID="11f057ae178774a983f91822d56a3aeebfddcca8a49e3b29136664115950cae2" Dec 01 14:02:14 crc kubenswrapper[4585]: I1201 14:02:14.039955 4585 scope.go:117] "RemoveContainer" containerID="3e1aeb8ba987c1d96923807ee1bf14e219bf47e0baf5f36664c22136bd8c8893" Dec 01 14:02:14 crc kubenswrapper[4585]: I1201 14:02:14.420305 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1162041e-acb6-4788-aaf3-841a80c7ec48" path="/var/lib/kubelet/pods/1162041e-acb6-4788-aaf3-841a80c7ec48/volumes" Dec 01 14:02:14 crc kubenswrapper[4585]: I1201 14:02:14.421133 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12d64d5a-7b7e-49c9-985a-14efebb14506" path="/var/lib/kubelet/pods/12d64d5a-7b7e-49c9-985a-14efebb14506/volumes" Dec 01 14:02:14 crc kubenswrapper[4585]: I1201 14:02:14.421962 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" path="/var/lib/kubelet/pods/2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3/volumes" Dec 01 14:02:14 crc kubenswrapper[4585]: I1201 14:02:14.423446 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" path="/var/lib/kubelet/pods/51b10af7-6b4e-49b2-81cd-50b2a5d5fd91/volumes" Dec 01 14:02:14 crc kubenswrapper[4585]: I1201 14:02:14.424425 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="611e970e-43b2-43b8-b2d6-6302693b7c88" path="/var/lib/kubelet/pods/611e970e-43b2-43b8-b2d6-6302693b7c88/volumes" Dec 01 14:02:14 crc kubenswrapper[4585]: I1201 14:02:14.425711 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" path="/var/lib/kubelet/pods/7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95/volumes" Dec 01 14:02:14 crc kubenswrapper[4585]: I1201 14:02:14.426508 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" path="/var/lib/kubelet/pods/96c46a5f-2a97-42e9-bf04-4a4b83caf45f/volumes" Dec 01 14:02:14 crc kubenswrapper[4585]: I1201 14:02:14.427884 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b615253a-f52e-4607-a63c-7cf1c07dab6b" path="/var/lib/kubelet/pods/b615253a-f52e-4607-a63c-7cf1c07dab6b/volumes" Dec 01 14:02:14 crc kubenswrapper[4585]: I1201 14:02:14.428558 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd194ad4-dd93-47aa-8c18-afc2426825ac" path="/var/lib/kubelet/pods/dd194ad4-dd93-47aa-8c18-afc2426825ac/volumes" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185300 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-558pq"] Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185599 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185618 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185642 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d64d5a-7b7e-49c9-985a-14efebb14506" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185650 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d64d5a-7b7e-49c9-985a-14efebb14506" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185658 4585 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185668 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185682 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b615253a-f52e-4607-a63c-7cf1c07dab6b" containerName="marketplace-operator" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185690 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b615253a-f52e-4607-a63c-7cf1c07dab6b" containerName="marketplace-operator" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185700 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1162041e-acb6-4788-aaf3-841a80c7ec48" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185708 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1162041e-acb6-4788-aaf3-841a80c7ec48" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185718 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd194ad4-dd93-47aa-8c18-afc2426825ac" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185727 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd194ad4-dd93-47aa-8c18-afc2426825ac" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185739 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185747 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185756 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185765 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185777 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611e970e-43b2-43b8-b2d6-6302693b7c88" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185785 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="611e970e-43b2-43b8-b2d6-6302693b7c88" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185796 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185804 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185813 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185821 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" containerName="extract-utilities" Dec 01 14:02:15 crc 
kubenswrapper[4585]: E1201 14:02:15.185833 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185841 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185850 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185858 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185869 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611e970e-43b2-43b8-b2d6-6302693b7c88" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185877 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="611e970e-43b2-43b8-b2d6-6302693b7c88" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185889 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d64d5a-7b7e-49c9-985a-14efebb14506" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185897 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d64d5a-7b7e-49c9-985a-14efebb14506" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185905 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185913 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185921 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1162041e-acb6-4788-aaf3-841a80c7ec48" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185928 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1162041e-acb6-4788-aaf3-841a80c7ec48" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185938 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d64d5a-7b7e-49c9-985a-14efebb14506" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185945 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d64d5a-7b7e-49c9-985a-14efebb14506" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185953 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185960 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" containerName="extract-content" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.185989 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1162041e-acb6-4788-aaf3-841a80c7ec48" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.185997 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1162041e-acb6-4788-aaf3-841a80c7ec48" 
containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.186008 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186015 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.186023 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611e970e-43b2-43b8-b2d6-6302693b7c88" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186031 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="611e970e-43b2-43b8-b2d6-6302693b7c88" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.186041 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd194ad4-dd93-47aa-8c18-afc2426825ac" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186051 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd194ad4-dd93-47aa-8c18-afc2426825ac" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.186062 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186071 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" containerName="extract-utilities" Dec 01 14:02:15 crc kubenswrapper[4585]: E1201 14:02:15.186082 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd194ad4-dd93-47aa-8c18-afc2426825ac" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186089 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd194ad4-dd93-47aa-8c18-afc2426825ac" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186200 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1162041e-acb6-4788-aaf3-841a80c7ec48" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186215 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="51b10af7-6b4e-49b2-81cd-50b2a5d5fd91" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186226 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c46a5f-2a97-42e9-bf04-4a4b83caf45f" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186237 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="12d64d5a-7b7e-49c9-985a-14efebb14506" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186247 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d07da8c-6db8-49c6-be0f-ebc0f4c0e8c3" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186259 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd194ad4-dd93-47aa-8c18-afc2426825ac" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186270 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="7239ff6c-5a7b-4ac8-a6aa-def2b76e5b95" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186281 4585 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="611e970e-43b2-43b8-b2d6-6302693b7c88" containerName="registry-server" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.186293 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b615253a-f52e-4607-a63c-7cf1c07dab6b" containerName="marketplace-operator" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.187207 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.190627 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.206760 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-558pq"] Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.313761 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820218ea-5c55-45de-a8f8-1a512cf30252-utilities\") pod \"community-operators-558pq\" (UID: \"820218ea-5c55-45de-a8f8-1a512cf30252\") " pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.313858 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsbw6\" (UniqueName: \"kubernetes.io/projected/820218ea-5c55-45de-a8f8-1a512cf30252-kube-api-access-nsbw6\") pod \"community-operators-558pq\" (UID: \"820218ea-5c55-45de-a8f8-1a512cf30252\") " pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.313903 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820218ea-5c55-45de-a8f8-1a512cf30252-catalog-content\") pod \"community-operators-558pq\" (UID: \"820218ea-5c55-45de-a8f8-1a512cf30252\") " pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.414992 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsbw6\" (UniqueName: \"kubernetes.io/projected/820218ea-5c55-45de-a8f8-1a512cf30252-kube-api-access-nsbw6\") pod \"community-operators-558pq\" (UID: \"820218ea-5c55-45de-a8f8-1a512cf30252\") " pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.415082 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820218ea-5c55-45de-a8f8-1a512cf30252-catalog-content\") pod \"community-operators-558pq\" (UID: \"820218ea-5c55-45de-a8f8-1a512cf30252\") " pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.415121 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820218ea-5c55-45de-a8f8-1a512cf30252-utilities\") pod \"community-operators-558pq\" (UID: \"820218ea-5c55-45de-a8f8-1a512cf30252\") " pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.415651 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/820218ea-5c55-45de-a8f8-1a512cf30252-utilities\") pod \"community-operators-558pq\" (UID: \"820218ea-5c55-45de-a8f8-1a512cf30252\") " pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.415767 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820218ea-5c55-45de-a8f8-1a512cf30252-catalog-content\") pod \"community-operators-558pq\" (UID: \"820218ea-5c55-45de-a8f8-1a512cf30252\") " pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.442214 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsbw6\" (UniqueName: \"kubernetes.io/projected/820218ea-5c55-45de-a8f8-1a512cf30252-kube-api-access-nsbw6\") pod \"community-operators-558pq\" (UID: \"820218ea-5c55-45de-a8f8-1a512cf30252\") " pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.516018 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:15 crc kubenswrapper[4585]: I1201 14:02:15.806615 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-558pq"] Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.174390 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vnkff"] Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.176014 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.179888 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.184385 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnkff"] Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.231982 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7e0be82-9218-49a5-a141-605615d845a8-catalog-content\") pod \"redhat-marketplace-vnkff\" (UID: \"f7e0be82-9218-49a5-a141-605615d845a8\") " pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.234441 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7e0be82-9218-49a5-a141-605615d845a8-utilities\") pod \"redhat-marketplace-vnkff\" (UID: \"f7e0be82-9218-49a5-a141-605615d845a8\") " pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.234711 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w6bg\" (UniqueName: \"kubernetes.io/projected/f7e0be82-9218-49a5-a141-605615d845a8-kube-api-access-6w6bg\") pod \"redhat-marketplace-vnkff\" (UID: \"f7e0be82-9218-49a5-a141-605615d845a8\") " pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.336289 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w6bg\" (UniqueName: 
\"kubernetes.io/projected/f7e0be82-9218-49a5-a141-605615d845a8-kube-api-access-6w6bg\") pod \"redhat-marketplace-vnkff\" (UID: \"f7e0be82-9218-49a5-a141-605615d845a8\") " pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.336450 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7e0be82-9218-49a5-a141-605615d845a8-catalog-content\") pod \"redhat-marketplace-vnkff\" (UID: \"f7e0be82-9218-49a5-a141-605615d845a8\") " pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.336505 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7e0be82-9218-49a5-a141-605615d845a8-utilities\") pod \"redhat-marketplace-vnkff\" (UID: \"f7e0be82-9218-49a5-a141-605615d845a8\") " pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.337088 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7e0be82-9218-49a5-a141-605615d845a8-catalog-content\") pod \"redhat-marketplace-vnkff\" (UID: \"f7e0be82-9218-49a5-a141-605615d845a8\") " pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.337297 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7e0be82-9218-49a5-a141-605615d845a8-utilities\") pod \"redhat-marketplace-vnkff\" (UID: \"f7e0be82-9218-49a5-a141-605615d845a8\") " pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.362080 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w6bg\" (UniqueName: \"kubernetes.io/projected/f7e0be82-9218-49a5-a141-605615d845a8-kube-api-access-6w6bg\") pod \"redhat-marketplace-vnkff\" (UID: \"f7e0be82-9218-49a5-a141-605615d845a8\") " pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.537647 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.700907 4585 generic.go:334] "Generic (PLEG): container finished" podID="820218ea-5c55-45de-a8f8-1a512cf30252" containerID="44b7f196db22ed0219d5430739fa98b4186fe15596d53442742e9809c6013d29" exitCode=0 Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.701140 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-558pq" event={"ID":"820218ea-5c55-45de-a8f8-1a512cf30252","Type":"ContainerDied","Data":"44b7f196db22ed0219d5430739fa98b4186fe15596d53442742e9809c6013d29"} Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.701572 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-558pq" event={"ID":"820218ea-5c55-45de-a8f8-1a512cf30252","Type":"ContainerStarted","Data":"ef83dc908a5d72992f32df6dd1aa3e54553f0898a1bc5e75cadbaa6d520f8d75"} Dec 01 14:02:16 crc kubenswrapper[4585]: I1201 14:02:16.957462 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnkff"] Dec 01 14:02:16 crc kubenswrapper[4585]: W1201 14:02:16.967137 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7e0be82_9218_49a5_a141_605615d845a8.slice/crio-e0568fc34cc0a38c3518ac9de62f5d95cb5ebac19ad22d0702eca640d89a185e WatchSource:0}: Error finding container e0568fc34cc0a38c3518ac9de62f5d95cb5ebac19ad22d0702eca640d89a185e: Status 404 returned error can't find the container with id e0568fc34cc0a38c3518ac9de62f5d95cb5ebac19ad22d0702eca640d89a185e Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.010681 4585 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.011537 4585 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.011873 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0" gracePeriod=15 Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.012089 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.012599 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878" gracePeriod=15 Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.012667 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4" gracePeriod=15 Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.012721 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc" gracePeriod=15 Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.013192 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63" gracePeriod=15 Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.014721 4585 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 14:02:17 crc kubenswrapper[4585]: E1201 14:02:17.015212 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015269 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 14:02:17 crc kubenswrapper[4585]: E1201 14:02:17.015287 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015312 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 14:02:17 crc kubenswrapper[4585]: E1201 14:02:17.015325 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015333 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 14:02:17 crc kubenswrapper[4585]: E1201 14:02:17.015347 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015356 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 14:02:17 crc kubenswrapper[4585]: E1201 14:02:17.015392 4585 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015400 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 01 14:02:17 crc kubenswrapper[4585]: E1201 14:02:17.015415 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015424 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015594 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015605 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015637 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015645 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015654 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015664 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 14:02:17 crc kubenswrapper[4585]: E1201 14:02:17.015783 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.015790 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 01 14:02:17 crc kubenswrapper[4585]: E1201 14:02:17.107359 4585 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.44:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: E1201 14:02:17.137152 4585 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-vnkff.187d1c4aed73b3a7 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-vnkff,UID:f7e0be82-9218-49a5-a141-605615d845a8,APIVersion:v1,ResourceVersion:29463,FieldPath:spec.initContainers{extract-utilities},},Reason:Created,Message:Created container extract-utilities,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 14:02:17.136042919 +0000 UTC 
m=+251.120256774,LastTimestamp:2025-12-01 14:02:17.136042919 +0000 UTC m=+251.120256774,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.149340 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.149383 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.149418 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.149435 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.149618 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.149696 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.149721 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.149764 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251439 
4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251498 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251528 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251571 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251598 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251621 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251661 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251684 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251774 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251788 4585 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251821 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251858 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251863 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251895 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251901 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.251929 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.408798 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:17 crc kubenswrapper[4585]: W1201 14:02:17.431170 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-d839f05ca34d2dfe29ea426b21f463559e9f81f5592d6946d426e665a3f44c77 WatchSource:0}: Error finding container d839f05ca34d2dfe29ea426b21f463559e9f81f5592d6946d426e665a3f44c77: Status 404 returned error can't find the container with id d839f05ca34d2dfe29ea426b21f463559e9f81f5592d6946d426e665a3f44c77 Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.710342 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-558pq" event={"ID":"820218ea-5c55-45de-a8f8-1a512cf30252","Type":"ContainerStarted","Data":"7e07cfbf95e6c2ba1f18bfafc9b5cdc93a27b597f8047708ef703b94c7a5ce9b"} Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.711877 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.712310 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.712686 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7e0be82-9218-49a5-a141-605615d845a8" containerID="278d0e0001c5c37622ca3db1f2cd681b57dccf0cc99c6129f5993c1e17773eee" exitCode=0 Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.712836 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnkff" event={"ID":"f7e0be82-9218-49a5-a141-605615d845a8","Type":"ContainerDied","Data":"278d0e0001c5c37622ca3db1f2cd681b57dccf0cc99c6129f5993c1e17773eee"} Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.712878 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnkff" event={"ID":"f7e0be82-9218-49a5-a141-605615d845a8","Type":"ContainerStarted","Data":"e0568fc34cc0a38c3518ac9de62f5d95cb5ebac19ad22d0702eca640d89a185e"} Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.713277 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.713569 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.713994 4585 status_manager.go:851] "Failed to get status for pod" 
podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.714392 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d839f05ca34d2dfe29ea426b21f463559e9f81f5592d6946d426e665a3f44c77"} Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.719035 4585 generic.go:334] "Generic (PLEG): container finished" podID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" containerID="3ec704010cac7e317585e795ee58f832edb80dd79118da87562bb2da53f0201f" exitCode=0 Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.719065 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6","Type":"ContainerDied","Data":"3ec704010cac7e317585e795ee58f832edb80dd79118da87562bb2da53f0201f"} Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.720247 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.720505 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.720887 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.722836 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.723180 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.725114 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.725922 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63" exitCode=0 Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.726022 4585 scope.go:117] 
"RemoveContainer" containerID="e7dfb22c5155fdc4bfdf9d54bf540e4018bca6a616835af7e8d5079bebcd45da" Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.726075 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878" exitCode=0 Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.726199 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4" exitCode=0 Dec 01 14:02:17 crc kubenswrapper[4585]: I1201 14:02:17.726221 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc" exitCode=2 Dec 01 14:02:18 crc kubenswrapper[4585]: I1201 14:02:18.739237 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 14:02:18 crc kubenswrapper[4585]: I1201 14:02:18.741659 4585 generic.go:334] "Generic (PLEG): container finished" podID="820218ea-5c55-45de-a8f8-1a512cf30252" containerID="7e07cfbf95e6c2ba1f18bfafc9b5cdc93a27b597f8047708ef703b94c7a5ce9b" exitCode=0 Dec 01 14:02:18 crc kubenswrapper[4585]: I1201 14:02:18.741949 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-558pq" event={"ID":"820218ea-5c55-45de-a8f8-1a512cf30252","Type":"ContainerDied","Data":"7e07cfbf95e6c2ba1f18bfafc9b5cdc93a27b597f8047708ef703b94c7a5ce9b"} Dec 01 14:02:18 crc kubenswrapper[4585]: I1201 14:02:18.743280 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:18 crc kubenswrapper[4585]: I1201 14:02:18.743440 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:18 crc kubenswrapper[4585]: I1201 14:02:18.743839 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:18 crc kubenswrapper[4585]: I1201 14:02:18.754429 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1e06131703ce57609710e4a76a8f711413bb460b2573d3245a118034bf94fcd4"} Dec 01 14:02:18 crc kubenswrapper[4585]: E1201 14:02:18.754767 4585 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.44:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:18 crc kubenswrapper[4585]: 
I1201 14:02:18.754917 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:18 crc kubenswrapper[4585]: I1201 14:02:18.755089 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:18 crc kubenswrapper[4585]: I1201 14:02:18.755315 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.003828 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.004690 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.005120 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.005377 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.086628 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-var-lock\") pod \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\" (UID: \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\") " Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.086729 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-kube-api-access\") pod \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\" (UID: \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\") " Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.088080 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-var-lock" (OuterVolumeSpecName: "var-lock") pod "13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" (UID: "13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.088958 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-kubelet-dir\") pod \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\" (UID: \"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6\") " Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.089078 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" (UID: "13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.089895 4585 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.089914 4585 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.101638 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" (UID: "13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.190714 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.461186 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.462741 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.463668 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.464108 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.464765 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.465040 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.596202 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.596961 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.596357 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.597131 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.597041 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.597259 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.597788 4585 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.597822 4585 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.597834 4585 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.764322 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7e0be82-9218-49a5-a141-605615d845a8" containerID="9eb5ebcdf451d2ae17a4e409729c3175c274024a4d4668ecff03272798952155" exitCode=0 Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.764424 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnkff" event={"ID":"f7e0be82-9218-49a5-a141-605615d845a8","Type":"ContainerDied","Data":"9eb5ebcdf451d2ae17a4e409729c3175c274024a4d4668ecff03272798952155"} Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.766731 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.767241 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.767797 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.768280 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.768787 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.768799 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6","Type":"ContainerDied","Data":"17b74ee182723b757e0fde498312ce4aea036e9c77fa30090ca7dcadbc91d3b6"} Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.768942 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17b74ee182723b757e0fde498312ce4aea036e9c77fa30090ca7dcadbc91d3b6" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.777158 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.778353 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0" exitCode=0 Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.778457 4585 scope.go:117] "RemoveContainer" containerID="b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.778646 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:19 crc kubenswrapper[4585]: E1201 14:02:19.779175 4585 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.44:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.791634 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.791963 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.792534 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.792781 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.798944 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" 
pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.799296 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.799530 4585 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.799778 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.802086 4585 scope.go:117] "RemoveContainer" containerID="7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.820571 4585 scope.go:117] "RemoveContainer" containerID="aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.850874 4585 scope.go:117] "RemoveContainer" containerID="949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.867630 4585 scope.go:117] "RemoveContainer" containerID="b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.887818 4585 scope.go:117] "RemoveContainer" containerID="a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.915025 4585 scope.go:117] "RemoveContainer" containerID="b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63" Dec 01 14:02:19 crc kubenswrapper[4585]: E1201 14:02:19.915911 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\": container with ID starting with b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63 not found: ID does not exist" containerID="b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.915953 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63"} err="failed to get container status \"b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\": rpc error: code = NotFound desc = could not find container \"b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63\": container with ID starting with b7d7c5d98b7f6dcade92ba77f78aa8345baf8161727831a616192726f6ef6f63 not found: ID does not exist" Dec 01 14:02:19 crc 
kubenswrapper[4585]: I1201 14:02:19.916015 4585 scope.go:117] "RemoveContainer" containerID="7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878" Dec 01 14:02:19 crc kubenswrapper[4585]: E1201 14:02:19.917022 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\": container with ID starting with 7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878 not found: ID does not exist" containerID="7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.917052 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878"} err="failed to get container status \"7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\": rpc error: code = NotFound desc = could not find container \"7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878\": container with ID starting with 7dc9645fabd7968d92ce9867130dddbb6c72664af68d6b5475cf3271d1975878 not found: ID does not exist" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.917071 4585 scope.go:117] "RemoveContainer" containerID="aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4" Dec 01 14:02:19 crc kubenswrapper[4585]: E1201 14:02:19.918215 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\": container with ID starting with aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4 not found: ID does not exist" containerID="aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.918245 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4"} err="failed to get container status \"aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\": rpc error: code = NotFound desc = could not find container \"aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4\": container with ID starting with aa22ac0ee9ad705a8affafffed373135005ef0380c9a646e06229a24d04d20b4 not found: ID does not exist" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.918264 4585 scope.go:117] "RemoveContainer" containerID="949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc" Dec 01 14:02:19 crc kubenswrapper[4585]: E1201 14:02:19.918719 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\": container with ID starting with 949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc not found: ID does not exist" containerID="949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.918791 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc"} err="failed to get container status \"949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\": rpc error: code = NotFound desc = could not find container 
\"949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc\": container with ID starting with 949f7fe4da35866e66212aea18ec1ed303bd542b07e53ff6562618c19de112cc not found: ID does not exist" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.918837 4585 scope.go:117] "RemoveContainer" containerID="b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0" Dec 01 14:02:19 crc kubenswrapper[4585]: E1201 14:02:19.919361 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\": container with ID starting with b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0 not found: ID does not exist" containerID="b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.919402 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0"} err="failed to get container status \"b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\": rpc error: code = NotFound desc = could not find container \"b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0\": container with ID starting with b40ee42d7d1c9043b0b5c0dea3baacc46c24842ec64ed3b0d0ae1acfaf65f8e0 not found: ID does not exist" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.919429 4585 scope.go:117] "RemoveContainer" containerID="a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda" Dec 01 14:02:19 crc kubenswrapper[4585]: E1201 14:02:19.919776 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\": container with ID starting with a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda not found: ID does not exist" containerID="a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda" Dec 01 14:02:19 crc kubenswrapper[4585]: I1201 14:02:19.919809 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda"} err="failed to get container status \"a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\": rpc error: code = NotFound desc = could not find container \"a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda\": container with ID starting with a0faab1d698ffaa26dda74e2d283f49da0b238703ff9cbd48c5de39f1e449cda not found: ID does not exist" Dec 01 14:02:20 crc kubenswrapper[4585]: I1201 14:02:20.431748 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 01 14:02:20 crc kubenswrapper[4585]: I1201 14:02:20.788800 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-558pq" event={"ID":"820218ea-5c55-45de-a8f8-1a512cf30252","Type":"ContainerStarted","Data":"4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9"} Dec 01 14:02:20 crc kubenswrapper[4585]: I1201 14:02:20.791024 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:20 crc kubenswrapper[4585]: I1201 14:02:20.791671 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:20 crc kubenswrapper[4585]: I1201 14:02:20.791959 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:20 crc kubenswrapper[4585]: I1201 14:02:20.793643 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnkff" event={"ID":"f7e0be82-9218-49a5-a141-605615d845a8","Type":"ContainerStarted","Data":"9dedc0b7034912d6142a80d949c7ac304a5c8226b2e5e38ddcfadbe5c7a3cb13"} Dec 01 14:02:20 crc kubenswrapper[4585]: I1201 14:02:20.794908 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:20 crc kubenswrapper[4585]: I1201 14:02:20.795481 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:20 crc kubenswrapper[4585]: I1201 14:02:20.795754 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:21 crc kubenswrapper[4585]: E1201 14:02:21.812417 4585 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-vnkff.187d1c4aed73b3a7 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-vnkff,UID:f7e0be82-9218-49a5-a141-605615d845a8,APIVersion:v1,ResourceVersion:29463,FieldPath:spec.initContainers{extract-utilities},},Reason:Created,Message:Created container extract-utilities,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 14:02:17.136042919 +0000 UTC m=+251.120256774,LastTimestamp:2025-12-01 14:02:17.136042919 +0000 UTC m=+251.120256774,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 14:02:22 crc kubenswrapper[4585]: E1201 
14:02:22.058362 4585 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:22 crc kubenswrapper[4585]: E1201 14:02:22.058759 4585 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:22 crc kubenswrapper[4585]: E1201 14:02:22.059201 4585 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:22 crc kubenswrapper[4585]: E1201 14:02:22.059706 4585 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:22 crc kubenswrapper[4585]: E1201 14:02:22.060034 4585 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:22 crc kubenswrapper[4585]: I1201 14:02:22.060072 4585 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 01 14:02:22 crc kubenswrapper[4585]: E1201 14:02:22.060411 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="200ms" Dec 01 14:02:22 crc kubenswrapper[4585]: E1201 14:02:22.261584 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="400ms" Dec 01 14:02:22 crc kubenswrapper[4585]: E1201 14:02:22.663073 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="800ms" Dec 01 14:02:23 crc kubenswrapper[4585]: E1201 14:02:23.465048 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="1.6s" Dec 01 14:02:25 crc kubenswrapper[4585]: E1201 14:02:25.066676 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="3.2s" Dec 01 14:02:25 crc kubenswrapper[4585]: I1201 14:02:25.516226 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:25 crc kubenswrapper[4585]: I1201 14:02:25.516916 4585 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:25 crc kubenswrapper[4585]: I1201 14:02:25.558467 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:25 crc kubenswrapper[4585]: I1201 14:02:25.559138 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:25 crc kubenswrapper[4585]: I1201 14:02:25.559590 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:25 crc kubenswrapper[4585]: I1201 14:02:25.559822 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:25 crc kubenswrapper[4585]: I1201 14:02:25.861442 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-558pq" Dec 01 14:02:25 crc kubenswrapper[4585]: I1201 14:02:25.862253 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:25 crc kubenswrapper[4585]: I1201 14:02:25.863044 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:25 crc kubenswrapper[4585]: I1201 14:02:25.863372 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.415545 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.415912 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.416236 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.538444 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.538504 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.585474 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.586230 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.586514 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.586965 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.866852 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vnkff" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.867483 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.868280 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:26 crc kubenswrapper[4585]: I1201 14:02:26.868600 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:28 crc kubenswrapper[4585]: E1201 14:02:28.267802 4585 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.44:6443: connect: connection refused" interval="6.4s" Dec 01 14:02:29 crc kubenswrapper[4585]: E1201 14:02:29.501736 4585 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-conmon-d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7.scope\": RecentStats: unable to find data in memory cache]" Dec 01 14:02:29 crc kubenswrapper[4585]: I1201 14:02:29.682322 4585 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 14:02:29 crc kubenswrapper[4585]: I1201 14:02:29.682650 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 14:02:29 crc kubenswrapper[4585]: I1201 14:02:29.844457 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 14:02:29 crc kubenswrapper[4585]: I1201 14:02:29.844498 4585 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7" exitCode=1 Dec 01 14:02:29 crc kubenswrapper[4585]: I1201 14:02:29.844526 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7"} Dec 01 14:02:29 crc kubenswrapper[4585]: I1201 14:02:29.844906 4585 scope.go:117] "RemoveContainer" containerID="d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7" Dec 01 14:02:29 crc kubenswrapper[4585]: I1201 14:02:29.845464 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:29 crc kubenswrapper[4585]: I1201 14:02:29.846328 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:29 crc kubenswrapper[4585]: I1201 14:02:29.846760 4585 
status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:29 crc kubenswrapper[4585]: I1201 14:02:29.847043 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:30 crc kubenswrapper[4585]: I1201 14:02:30.851656 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 14:02:30 crc kubenswrapper[4585]: I1201 14:02:30.851715 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ea8eb7ae336552fdfc54bff029db229c87e34d7723ff20d387057752f0ba5113"} Dec 01 14:02:30 crc kubenswrapper[4585]: I1201 14:02:30.852378 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:30 crc kubenswrapper[4585]: I1201 14:02:30.852595 4585 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:30 crc kubenswrapper[4585]: I1201 14:02:30.852824 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:30 crc kubenswrapper[4585]: I1201 14:02:30.853010 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:31 crc kubenswrapper[4585]: E1201 14:02:31.813464 4585 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.44:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-vnkff.187d1c4aed73b3a7 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-vnkff,UID:f7e0be82-9218-49a5-a141-605615d845a8,APIVersion:v1,ResourceVersion:29463,FieldPath:spec.initContainers{extract-utilities},},Reason:Created,Message:Created container extract-utilities,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 14:02:17.136042919 +0000 UTC m=+251.120256774,LastTimestamp:2025-12-01 14:02:17.136042919 +0000 UTC m=+251.120256774,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 14:02:32 crc kubenswrapper[4585]: I1201 14:02:32.412632 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:32 crc kubenswrapper[4585]: I1201 14:02:32.414906 4585 status_manager.go:851] "Failed to get status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:32 crc kubenswrapper[4585]: I1201 14:02:32.415389 4585 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:32 crc kubenswrapper[4585]: I1201 14:02:32.415737 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:32 crc kubenswrapper[4585]: I1201 14:02:32.416364 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:32 crc kubenswrapper[4585]: I1201 14:02:32.433762 4585 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b10cdae7-154c-4fd6-a308-02843603d7ff" Dec 01 14:02:32 crc kubenswrapper[4585]: I1201 14:02:32.433795 4585 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b10cdae7-154c-4fd6-a308-02843603d7ff" Dec 01 14:02:32 crc kubenswrapper[4585]: E1201 14:02:32.434357 4585 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:32 crc kubenswrapper[4585]: I1201 14:02:32.434879 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:32 crc kubenswrapper[4585]: W1201 14:02:32.460778 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-6c17c1e1f9155c93a7f57d2b67402d41c894f3abb9eea1eb088b760f9ba171a7 WatchSource:0}: Error finding container 6c17c1e1f9155c93a7f57d2b67402d41c894f3abb9eea1eb088b760f9ba171a7: Status 404 returned error can't find the container with id 6c17c1e1f9155c93a7f57d2b67402d41c894f3abb9eea1eb088b760f9ba171a7 Dec 01 14:02:32 crc kubenswrapper[4585]: I1201 14:02:32.863597 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6c17c1e1f9155c93a7f57d2b67402d41c894f3abb9eea1eb088b760f9ba171a7"} Dec 01 14:02:33 crc kubenswrapper[4585]: I1201 14:02:33.870641 4585 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="16ded2fa817a023ca62c1afc818462f073dcc6e58dedee4190eaec0949c32880" exitCode=0 Dec 01 14:02:33 crc kubenswrapper[4585]: I1201 14:02:33.870679 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"16ded2fa817a023ca62c1afc818462f073dcc6e58dedee4190eaec0949c32880"} Dec 01 14:02:33 crc kubenswrapper[4585]: I1201 14:02:33.871070 4585 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b10cdae7-154c-4fd6-a308-02843603d7ff" Dec 01 14:02:33 crc kubenswrapper[4585]: I1201 14:02:33.871090 4585 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b10cdae7-154c-4fd6-a308-02843603d7ff" Dec 01 14:02:33 crc kubenswrapper[4585]: E1201 14:02:33.872665 4585 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:33 crc kubenswrapper[4585]: I1201 14:02:33.872748 4585 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:33 crc kubenswrapper[4585]: I1201 14:02:33.873888 4585 status_manager.go:851] "Failed to get status for pod" podUID="f7e0be82-9218-49a5-a141-605615d845a8" pod="openshift-marketplace/redhat-marketplace-vnkff" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-vnkff\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:33 crc kubenswrapper[4585]: I1201 14:02:33.874350 4585 status_manager.go:851] "Failed to get status for pod" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:33 crc kubenswrapper[4585]: I1201 14:02:33.874632 4585 status_manager.go:851] "Failed to get 
status for pod" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" pod="openshift-marketplace/community-operators-558pq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-558pq\": dial tcp 38.102.83.44:6443: connect: connection refused" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.013226 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" podUID="79abd33c-0184-473e-8bb9-c408a5c32efc" containerName="oauth-openshift" containerID="cri-o://35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497" gracePeriod=15 Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.398606 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496172 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-router-certs\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496230 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-serving-cert\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496280 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-idp-0-file-data\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496312 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-audit-policies\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496331 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-cliconfig\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496347 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-trusted-ca-bundle\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496374 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72q79\" (UniqueName: \"kubernetes.io/projected/79abd33c-0184-473e-8bb9-c408a5c32efc-kube-api-access-72q79\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 
crc kubenswrapper[4585]: I1201 14:02:34.496396 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-session\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496416 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-service-ca\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496435 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-provider-selection\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496462 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79abd33c-0184-473e-8bb9-c408a5c32efc-audit-dir\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496486 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-error\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496504 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-login\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.496524 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-ocp-branding-template\") pod \"79abd33c-0184-473e-8bb9-c408a5c32efc\" (UID: \"79abd33c-0184-473e-8bb9-c408a5c32efc\") " Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.502147 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.502462 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79abd33c-0184-473e-8bb9-c408a5c32efc-kube-api-access-72q79" (OuterVolumeSpecName: "kube-api-access-72q79") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). 
InnerVolumeSpecName "kube-api-access-72q79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.502526 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79abd33c-0184-473e-8bb9-c408a5c32efc-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.503121 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.504377 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.506326 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.506727 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.506917 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.507856 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.509092 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.509524 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.510718 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.512120 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.515690 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "79abd33c-0184-473e-8bb9-c408a5c32efc" (UID: "79abd33c-0184-473e-8bb9-c408a5c32efc"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597604 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597632 4585 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597643 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597655 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597665 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72q79\" (UniqueName: \"kubernetes.io/projected/79abd33c-0184-473e-8bb9-c408a5c32efc-kube-api-access-72q79\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597673 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597682 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597692 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597702 4585 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79abd33c-0184-473e-8bb9-c408a5c32efc-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597714 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597723 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597734 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597744 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.597752 4585 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/79abd33c-0184-473e-8bb9-c408a5c32efc-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.887738 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8fe81ead82755cc57f9260b1152db27b894b95078a66157c6a4a173954983e86"} Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.887782 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"83f2787c0613fc91171ff0167a27a2829212a3017ce693cd59c1d664a9776299"} Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.887795 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0b8d07d23486439947941c126b9592205d586b701873015aacafb7917614d5fe"} Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.890406 4585 generic.go:334] "Generic (PLEG): container finished" podID="79abd33c-0184-473e-8bb9-c408a5c32efc" containerID="35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497" exitCode=0 Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.890447 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" event={"ID":"79abd33c-0184-473e-8bb9-c408a5c32efc","Type":"ContainerDied","Data":"35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497"} Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.890466 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" event={"ID":"79abd33c-0184-473e-8bb9-c408a5c32efc","Type":"ContainerDied","Data":"c46d3fa15dabe36f629944748219ada1ed13fbd3eae3e4a9a3525fb66a6a7a8d"} Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.890483 4585 scope.go:117] "RemoveContainer" containerID="35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497" Dec 01 14:02:34 crc kubenswrapper[4585]: I1201 14:02:34.890620 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c7gls" Dec 01 14:02:35 crc kubenswrapper[4585]: I1201 14:02:35.011568 4585 scope.go:117] "RemoveContainer" containerID="35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497" Dec 01 14:02:35 crc kubenswrapper[4585]: E1201 14:02:35.012013 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497\": container with ID starting with 35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497 not found: ID does not exist" containerID="35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497" Dec 01 14:02:35 crc kubenswrapper[4585]: I1201 14:02:35.012062 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497"} err="failed to get container status \"35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497\": rpc error: code = NotFound desc = could not find container \"35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497\": container with ID starting with 35526697d308ffd1515ecaf107060b6aed20a104c81b79a8baab559edf7f4497 not found: ID does not exist" Dec 01 14:02:35 crc kubenswrapper[4585]: I1201 14:02:35.538303 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 14:02:35 crc kubenswrapper[4585]: I1201 14:02:35.538806 4585 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 14:02:35 crc kubenswrapper[4585]: I1201 14:02:35.538884 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 14:02:35 crc kubenswrapper[4585]: I1201 14:02:35.901076 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ff0d20c03e9922a4993a2a4faccd52fc50e80acc952015da4f9d702152be2de1"} Dec 01 14:02:35 crc kubenswrapper[4585]: I1201 14:02:35.901132 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"306c7a89f117d5329bfe833452b1adf976bd69ce1163e4da72d439dab9f13bcf"} Dec 01 14:02:35 crc kubenswrapper[4585]: I1201 14:02:35.901420 4585 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b10cdae7-154c-4fd6-a308-02843603d7ff" Dec 01 14:02:35 crc kubenswrapper[4585]: I1201 14:02:35.901438 4585 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b10cdae7-154c-4fd6-a308-02843603d7ff" Dec 01 14:02:35 crc kubenswrapper[4585]: I1201 14:02:35.901648 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:37 crc 
kubenswrapper[4585]: I1201 14:02:37.435118 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:37 crc kubenswrapper[4585]: I1201 14:02:37.435443 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:37 crc kubenswrapper[4585]: I1201 14:02:37.439805 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:39 crc kubenswrapper[4585]: I1201 14:02:39.682437 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 14:02:40 crc kubenswrapper[4585]: I1201 14:02:40.940423 4585 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:41 crc kubenswrapper[4585]: I1201 14:02:41.030504 4585 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4ddf4485-4b7a-42a1-a00d-6b3bb3bc2395" Dec 01 14:02:41 crc kubenswrapper[4585]: I1201 14:02:41.933427 4585 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b10cdae7-154c-4fd6-a308-02843603d7ff" Dec 01 14:02:41 crc kubenswrapper[4585]: I1201 14:02:41.933492 4585 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b10cdae7-154c-4fd6-a308-02843603d7ff" Dec 01 14:02:41 crc kubenswrapper[4585]: I1201 14:02:41.937162 4585 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4ddf4485-4b7a-42a1-a00d-6b3bb3bc2395" Dec 01 14:02:41 crc kubenswrapper[4585]: I1201 14:02:41.938335 4585 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://0b8d07d23486439947941c126b9592205d586b701873015aacafb7917614d5fe" Dec 01 14:02:41 crc kubenswrapper[4585]: I1201 14:02:41.938384 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:02:42 crc kubenswrapper[4585]: I1201 14:02:42.941838 4585 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b10cdae7-154c-4fd6-a308-02843603d7ff" Dec 01 14:02:42 crc kubenswrapper[4585]: I1201 14:02:42.941894 4585 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b10cdae7-154c-4fd6-a308-02843603d7ff" Dec 01 14:02:42 crc kubenswrapper[4585]: I1201 14:02:42.948161 4585 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4ddf4485-4b7a-42a1-a00d-6b3bb3bc2395" Dec 01 14:02:45 crc kubenswrapper[4585]: I1201 14:02:45.538080 4585 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 14:02:45 crc kubenswrapper[4585]: I1201 14:02:45.538533 4585 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 14:02:50 crc kubenswrapper[4585]: I1201 14:02:50.782787 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 14:02:50 crc kubenswrapper[4585]: I1201 14:02:50.783154 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 14:02:51 crc kubenswrapper[4585]: I1201 14:02:51.267957 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 14:02:51 crc kubenswrapper[4585]: I1201 14:02:51.562301 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 14:02:51 crc kubenswrapper[4585]: I1201 14:02:51.721395 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 14:02:51 crc kubenswrapper[4585]: I1201 14:02:51.792809 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 14:02:51 crc kubenswrapper[4585]: I1201 14:02:51.965908 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 14:02:52 crc kubenswrapper[4585]: I1201 14:02:52.010443 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 14:02:52 crc kubenswrapper[4585]: I1201 14:02:52.174898 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 14:02:52 crc kubenswrapper[4585]: I1201 14:02:52.498458 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 14:02:52 crc kubenswrapper[4585]: I1201 14:02:52.665455 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 14:02:53 crc kubenswrapper[4585]: I1201 14:02:53.042358 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 14:02:53 crc kubenswrapper[4585]: I1201 14:02:53.115946 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 14:02:53 crc kubenswrapper[4585]: I1201 14:02:53.137789 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 14:02:53 crc kubenswrapper[4585]: I1201 14:02:53.156470 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 14:02:53 crc kubenswrapper[4585]: I1201 14:02:53.611486 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 14:02:53 crc kubenswrapper[4585]: I1201 14:02:53.620515 4585 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 14:02:53 crc kubenswrapper[4585]: I1201 14:02:53.647930 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 14:02:53 crc kubenswrapper[4585]: I1201 14:02:53.648321 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.013302 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.015957 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.066635 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.072869 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.076792 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.100873 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.120172 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.138055 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.154235 4585 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.184903 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.203843 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.327233 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.342038 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.393796 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.431698 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.444934 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.457909 4585 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.540996 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.655762 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.689431 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.791521 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.905499 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.936288 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 14:02:54 crc kubenswrapper[4585]: I1201 14:02:54.972545 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.009211 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.105859 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.150766 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.151957 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.172331 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.187881 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.299098 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.403070 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.413092 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.506439 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.509400 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.537875 4585 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.538453 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.538524 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.539416 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"ea8eb7ae336552fdfc54bff029db229c87e34d7723ff20d387057752f0ba5113"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.539527 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://ea8eb7ae336552fdfc54bff029db229c87e34d7723ff20d387057752f0ba5113" gracePeriod=30 Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.563720 4585 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.566317 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.761237 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.779488 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.833627 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.845345 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.929699 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.929742 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 14:02:55 crc kubenswrapper[4585]: I1201 14:02:55.931607 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 14:02:56 crc kubenswrapper[4585]: I1201 14:02:56.139935 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 14:02:56 crc kubenswrapper[4585]: I1201 
14:02:56.191267 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 14:02:56 crc kubenswrapper[4585]: I1201 14:02:56.216920 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 14:02:56 crc kubenswrapper[4585]: I1201 14:02:56.352420 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 14:02:56 crc kubenswrapper[4585]: I1201 14:02:56.356171 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 14:02:56 crc kubenswrapper[4585]: I1201 14:02:56.383923 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 14:02:56 crc kubenswrapper[4585]: I1201 14:02:56.593350 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 14:02:56 crc kubenswrapper[4585]: I1201 14:02:56.659284 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 14:02:56 crc kubenswrapper[4585]: I1201 14:02:56.720210 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 14:02:57 crc kubenswrapper[4585]: I1201 14:02:57.064190 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 14:02:57 crc kubenswrapper[4585]: I1201 14:02:57.166528 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 14:02:57 crc kubenswrapper[4585]: I1201 14:02:57.204215 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 14:02:57 crc kubenswrapper[4585]: I1201 14:02:57.246212 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 14:02:57 crc kubenswrapper[4585]: I1201 14:02:57.410260 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 14:02:57 crc kubenswrapper[4585]: I1201 14:02:57.412413 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 14:02:57 crc kubenswrapper[4585]: I1201 14:02:57.499546 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 14:02:57 crc kubenswrapper[4585]: I1201 14:02:57.575262 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 14:02:57 crc kubenswrapper[4585]: I1201 14:02:57.583278 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 14:02:57 crc kubenswrapper[4585]: I1201 14:02:57.717412 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 14:02:57 crc kubenswrapper[4585]: I1201 14:02:57.773024 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 14:02:57 crc 
kubenswrapper[4585]: I1201 14:02:57.792242 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 14:02:57 crc kubenswrapper[4585]: I1201 14:02:57.809255 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.045704 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.122310 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.162944 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.185520 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.276299 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.296722 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.358700 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.474489 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.537426 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.539047 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.539714 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.585292 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.739229 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.851466 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.859250 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.862340 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.932032 4585 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.938374 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 14:02:58 crc kubenswrapper[4585]: I1201 14:02:58.963072 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.003438 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.015529 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.135893 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.161769 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.239852 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.259078 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.401061 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.424791 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.580540 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.582465 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.628036 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.693160 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.725947 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.727390 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.788903 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.854990 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 14:02:59 crc kubenswrapper[4585]: I1201 14:02:59.878479 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 14:02:59 
crc kubenswrapper[4585]: I1201 14:02:59.941235 4585 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.021857 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.065568 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.070589 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.076305 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.107566 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.125134 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.175635 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.225862 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.441907 4585 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.445482 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vnkff" podStartSLOduration=41.840511561 podStartE2EDuration="44.445459573s" podCreationTimestamp="2025-12-01 14:02:16 +0000 UTC" firstStartedPulling="2025-12-01 14:02:17.713829403 +0000 UTC m=+251.698043258" lastFinishedPulling="2025-12-01 14:02:20.318777415 +0000 UTC m=+254.302991270" observedRunningTime="2025-12-01 14:02:40.979152927 +0000 UTC m=+274.963366772" watchObservedRunningTime="2025-12-01 14:03:00.445459573 +0000 UTC m=+294.429673428" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.446896 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-558pq" podStartSLOduration=41.963218789 podStartE2EDuration="45.446889755s" podCreationTimestamp="2025-12-01 14:02:15 +0000 UTC" firstStartedPulling="2025-12-01 14:02:16.703774784 +0000 UTC m=+250.687988639" lastFinishedPulling="2025-12-01 14:02:20.18744575 +0000 UTC m=+254.171659605" observedRunningTime="2025-12-01 14:02:40.889781613 +0000 UTC m=+274.873995468" watchObservedRunningTime="2025-12-01 14:03:00.446889755 +0000 UTC m=+294.431103610" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.447400 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c7gls","openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.447453 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 14:03:00 crc 
kubenswrapper[4585]: I1201 14:03:00.453360 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.465112 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.476434 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.476409988 podStartE2EDuration="20.476409988s" podCreationTimestamp="2025-12-01 14:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:03:00.472550087 +0000 UTC m=+294.456763962" watchObservedRunningTime="2025-12-01 14:03:00.476409988 +0000 UTC m=+294.460623843" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.515551 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.521943 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.530419 4585 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.571597 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.601482 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.819947 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.849676 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 14:03:00 crc kubenswrapper[4585]: I1201 14:03:00.931663 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.033162 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.055178 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.069208 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.085145 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.131942 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.153825 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 
14:03:01.204127 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.236433 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.262378 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.283267 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.366582 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.446601 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.685025 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.718187 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.779192 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.893460 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 14:03:01 crc kubenswrapper[4585]: I1201 14:03:01.937747 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.048460 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.115053 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.124214 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-54f75f9d4b-mwldk"] Dec 01 14:03:02 crc kubenswrapper[4585]: E1201 14:03:02.124679 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" containerName="installer" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.124705 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" containerName="installer" Dec 01 14:03:02 crc kubenswrapper[4585]: E1201 14:03:02.124722 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79abd33c-0184-473e-8bb9-c408a5c32efc" containerName="oauth-openshift" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.124732 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="79abd33c-0184-473e-8bb9-c408a5c32efc" containerName="oauth-openshift" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.124857 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="79abd33c-0184-473e-8bb9-c408a5c32efc" 
containerName="oauth-openshift" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.124877 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="13110086-95d8-4bf3-bc4f-7a7d2f2f7ae6" containerName="installer" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.125587 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.130529 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.130827 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.130956 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.131356 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.131701 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.132069 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.132239 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.132341 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.132567 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.132568 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.133261 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.133272 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.140320 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.150707 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.152495 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.195008 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.238877 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-session\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.238938 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.239000 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.239036 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.239061 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.239102 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-audit-policies\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.239145 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-user-template-login\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.239174 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " 
pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.239206 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-user-template-error\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.239238 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-audit-dir\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.239269 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.239299 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.239325 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.239361 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d25v\" (UniqueName: \"kubernetes.io/projected/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-kube-api-access-8d25v\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.255267 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.299925 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.330930 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.341255 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.341494 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.341663 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-audit-policies\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.341812 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-user-template-login\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.341922 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.342062 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-user-template-error\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.342184 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-audit-dir\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.342305 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.342419 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.342528 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.342650 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d25v\" (UniqueName: \"kubernetes.io/projected/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-kube-api-access-8d25v\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.342726 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-audit-policies\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.342861 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-session\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.343001 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.343140 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.343213 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-service-ca\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.344499 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.347412 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-audit-dir\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.348913 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.349936 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-router-certs\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.350654 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.350808 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-user-template-login\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.350909 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-user-template-error\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.351680 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-session\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.352730 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.353360 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.362208 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.365639 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d25v\" (UniqueName: \"kubernetes.io/projected/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5-kube-api-access-8d25v\") pod \"oauth-openshift-54f75f9d4b-mwldk\" (UID: \"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\") " pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.422561 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79abd33c-0184-473e-8bb9-c408a5c32efc" path="/var/lib/kubelet/pods/79abd33c-0184-473e-8bb9-c408a5c32efc/volumes" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.443452 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.444534 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.460258 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.562887 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.585411 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.591946 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.684108 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.700212 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.716170 4585 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.820407 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.862485 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.883956 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.931408 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.938585 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.944589 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 14:03:02 crc kubenswrapper[4585]: I1201 14:03:02.976303 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.240409 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.310508 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.362830 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.363199 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.412099 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 
14:03:03.558413 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.579320 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.649832 4585 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.650177 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://1e06131703ce57609710e4a76a8f711413bb460b2573d3245a118034bf94fcd4" gracePeriod=5 Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.657244 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.664056 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.717374 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.773224 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.855293 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.955870 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 14:03:03 crc kubenswrapper[4585]: I1201 14:03:03.986323 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 14:03:04 crc kubenswrapper[4585]: I1201 14:03:04.057471 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 14:03:04 crc kubenswrapper[4585]: I1201 14:03:04.107958 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 14:03:04 crc kubenswrapper[4585]: I1201 14:03:04.136811 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 14:03:04 crc kubenswrapper[4585]: I1201 14:03:04.146633 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 14:03:04 crc kubenswrapper[4585]: I1201 14:03:04.222749 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 14:03:04 crc kubenswrapper[4585]: I1201 14:03:04.306925 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 14:03:04 crc kubenswrapper[4585]: I1201 14:03:04.476095 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 14:03:04 crc 
kubenswrapper[4585]: I1201 14:03:04.848796 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 14:03:04 crc kubenswrapper[4585]: I1201 14:03:04.999716 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 14:03:05 crc kubenswrapper[4585]: I1201 14:03:05.064561 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 14:03:05 crc kubenswrapper[4585]: I1201 14:03:05.118209 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 14:03:05 crc kubenswrapper[4585]: I1201 14:03:05.131867 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 14:03:05 crc kubenswrapper[4585]: I1201 14:03:05.277572 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 14:03:05 crc kubenswrapper[4585]: I1201 14:03:05.432228 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-54f75f9d4b-mwldk"] Dec 01 14:03:05 crc kubenswrapper[4585]: I1201 14:03:05.653491 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 14:03:05 crc kubenswrapper[4585]: E1201 14:03:05.807224 4585 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 01 14:03:05 crc kubenswrapper[4585]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-54f75f9d4b-mwldk_openshift-authentication_393c2d82-7e3d-4ae5-a0ba-3f9599877dc5_0(1aa76d769f9151f15a378f1eb621fffa1cc6e1569bc1107c5f52d45b1ce40678): error adding pod openshift-authentication_oauth-openshift-54f75f9d4b-mwldk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1aa76d769f9151f15a378f1eb621fffa1cc6e1569bc1107c5f52d45b1ce40678" Netns:"/var/run/netns/704af04a-da7d-440c-bbba-430c0ebdebc8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-54f75f9d4b-mwldk;K8S_POD_INFRA_CONTAINER_ID=1aa76d769f9151f15a378f1eb621fffa1cc6e1569bc1107c5f52d45b1ce40678;K8S_POD_UID=393c2d82-7e3d-4ae5-a0ba-3f9599877dc5" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-54f75f9d4b-mwldk] networking: Multus: [openshift-authentication/oauth-openshift-54f75f9d4b-mwldk/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-54f75f9d4b-mwldk in out of cluster comm: pod "oauth-openshift-54f75f9d4b-mwldk" not found Dec 01 14:03:05 crc kubenswrapper[4585]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 01 14:03:05 crc kubenswrapper[4585]: > Dec 01 14:03:05 crc kubenswrapper[4585]: E1201 14:03:05.807305 4585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 
01 14:03:05 crc kubenswrapper[4585]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-54f75f9d4b-mwldk_openshift-authentication_393c2d82-7e3d-4ae5-a0ba-3f9599877dc5_0(1aa76d769f9151f15a378f1eb621fffa1cc6e1569bc1107c5f52d45b1ce40678): error adding pod openshift-authentication_oauth-openshift-54f75f9d4b-mwldk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1aa76d769f9151f15a378f1eb621fffa1cc6e1569bc1107c5f52d45b1ce40678" Netns:"/var/run/netns/704af04a-da7d-440c-bbba-430c0ebdebc8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-54f75f9d4b-mwldk;K8S_POD_INFRA_CONTAINER_ID=1aa76d769f9151f15a378f1eb621fffa1cc6e1569bc1107c5f52d45b1ce40678;K8S_POD_UID=393c2d82-7e3d-4ae5-a0ba-3f9599877dc5" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-54f75f9d4b-mwldk] networking: Multus: [openshift-authentication/oauth-openshift-54f75f9d4b-mwldk/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-54f75f9d4b-mwldk in out of cluster comm: pod "oauth-openshift-54f75f9d4b-mwldk" not found Dec 01 14:03:05 crc kubenswrapper[4585]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 01 14:03:05 crc kubenswrapper[4585]: > pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:05 crc kubenswrapper[4585]: E1201 14:03:05.807353 4585 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 01 14:03:05 crc kubenswrapper[4585]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-54f75f9d4b-mwldk_openshift-authentication_393c2d82-7e3d-4ae5-a0ba-3f9599877dc5_0(1aa76d769f9151f15a378f1eb621fffa1cc6e1569bc1107c5f52d45b1ce40678): error adding pod openshift-authentication_oauth-openshift-54f75f9d4b-mwldk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1aa76d769f9151f15a378f1eb621fffa1cc6e1569bc1107c5f52d45b1ce40678" Netns:"/var/run/netns/704af04a-da7d-440c-bbba-430c0ebdebc8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-54f75f9d4b-mwldk;K8S_POD_INFRA_CONTAINER_ID=1aa76d769f9151f15a378f1eb621fffa1cc6e1569bc1107c5f52d45b1ce40678;K8S_POD_UID=393c2d82-7e3d-4ae5-a0ba-3f9599877dc5" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-54f75f9d4b-mwldk] networking: Multus: [openshift-authentication/oauth-openshift-54f75f9d4b-mwldk/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-54f75f9d4b-mwldk in out of cluster comm: pod "oauth-openshift-54f75f9d4b-mwldk" not found Dec 01 14:03:05 crc kubenswrapper[4585]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 01 14:03:05 crc kubenswrapper[4585]: > pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:05 crc kubenswrapper[4585]: E1201 14:03:05.807420 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-54f75f9d4b-mwldk_openshift-authentication(393c2d82-7e3d-4ae5-a0ba-3f9599877dc5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-54f75f9d4b-mwldk_openshift-authentication(393c2d82-7e3d-4ae5-a0ba-3f9599877dc5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-54f75f9d4b-mwldk_openshift-authentication_393c2d82-7e3d-4ae5-a0ba-3f9599877dc5_0(1aa76d769f9151f15a378f1eb621fffa1cc6e1569bc1107c5f52d45b1ce40678): error adding pod openshift-authentication_oauth-openshift-54f75f9d4b-mwldk to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"1aa76d769f9151f15a378f1eb621fffa1cc6e1569bc1107c5f52d45b1ce40678\\\" Netns:\\\"/var/run/netns/704af04a-da7d-440c-bbba-430c0ebdebc8\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-54f75f9d4b-mwldk;K8S_POD_INFRA_CONTAINER_ID=1aa76d769f9151f15a378f1eb621fffa1cc6e1569bc1107c5f52d45b1ce40678;K8S_POD_UID=393c2d82-7e3d-4ae5-a0ba-3f9599877dc5\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-54f75f9d4b-mwldk] networking: Multus: [openshift-authentication/oauth-openshift-54f75f9d4b-mwldk/393c2d82-7e3d-4ae5-a0ba-3f9599877dc5]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-54f75f9d4b-mwldk in out of cluster comm: pod \\\"oauth-openshift-54f75f9d4b-mwldk\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" podUID="393c2d82-7e3d-4ae5-a0ba-3f9599877dc5" Dec 01 14:03:05 crc kubenswrapper[4585]: I1201 14:03:05.859084 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 14:03:06 crc kubenswrapper[4585]: I1201 14:03:06.045427 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 14:03:06 crc kubenswrapper[4585]: I1201 14:03:06.086852 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:06 crc kubenswrapper[4585]: I1201 14:03:06.087602 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:06 crc kubenswrapper[4585]: I1201 14:03:06.208769 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 14:03:06 crc kubenswrapper[4585]: I1201 14:03:06.252356 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 14:03:06 crc kubenswrapper[4585]: I1201 14:03:06.259262 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 14:03:06 crc kubenswrapper[4585]: I1201 14:03:06.307165 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 14:03:06 crc kubenswrapper[4585]: I1201 14:03:06.316816 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-54f75f9d4b-mwldk"] Dec 01 14:03:06 crc kubenswrapper[4585]: I1201 14:03:06.360085 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 14:03:06 crc kubenswrapper[4585]: I1201 14:03:06.416713 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 14:03:07 crc kubenswrapper[4585]: I1201 14:03:07.095117 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" event={"ID":"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5","Type":"ContainerStarted","Data":"fcdc577878c4d40e3146819f27037276c239aa0b49a990a75e59c97867361e21"} Dec 01 14:03:07 crc kubenswrapper[4585]: I1201 14:03:07.095437 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:07 crc kubenswrapper[4585]: I1201 14:03:07.095453 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" event={"ID":"393c2d82-7e3d-4ae5-a0ba-3f9599877dc5","Type":"ContainerStarted","Data":"ff5cde076e926706350a38c9d068aa76f7de7954ea6cf74a64b926f11307ff21"} Dec 01 14:03:07 crc kubenswrapper[4585]: I1201 14:03:07.141120 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" Dec 01 14:03:07 crc kubenswrapper[4585]: I1201 14:03:07.164866 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-54f75f9d4b-mwldk" podStartSLOduration=59.164843069 podStartE2EDuration="59.164843069s" podCreationTimestamp="2025-12-01 14:02:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:03:07.124891815 +0000 UTC m=+301.109105740" watchObservedRunningTime="2025-12-01 14:03:07.164843069 +0000 UTC m=+301.149056924" Dec 01 14:03:07 crc kubenswrapper[4585]: I1201 14:03:07.282990 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.112946 4585 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.114928 4585 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="1e06131703ce57609710e4a76a8f711413bb460b2573d3245a118034bf94fcd4" exitCode=137 Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.236650 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.236745 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.355938 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.356099 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.356142 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.356230 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.356265 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.356305 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.356319 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.356372 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.356433 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.356721 4585 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.356747 4585 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.356764 4585 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.356779 4585 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.367894 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:03:09 crc kubenswrapper[4585]: I1201 14:03:09.458323 4585 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 14:03:10 crc kubenswrapper[4585]: I1201 14:03:10.140228 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 01 14:03:10 crc kubenswrapper[4585]: I1201 14:03:10.140440 4585 scope.go:117] "RemoveContainer" containerID="1e06131703ce57609710e4a76a8f711413bb460b2573d3245a118034bf94fcd4" Dec 01 14:03:10 crc kubenswrapper[4585]: I1201 14:03:10.140504 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 14:03:10 crc kubenswrapper[4585]: I1201 14:03:10.421499 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 01 14:03:16 crc kubenswrapper[4585]: I1201 14:03:16.802549 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 14:03:19 crc kubenswrapper[4585]: I1201 14:03:19.412701 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 14:03:22 crc kubenswrapper[4585]: I1201 14:03:22.317254 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 14:03:23 crc kubenswrapper[4585]: I1201 14:03:23.595803 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 14:03:23 crc kubenswrapper[4585]: I1201 14:03:23.717713 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 14:03:24 crc kubenswrapper[4585]: I1201 14:03:24.094784 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 14:03:24 crc kubenswrapper[4585]: I1201 14:03:24.797677 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 14:03:24 crc kubenswrapper[4585]: I1201 14:03:24.904900 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 14:03:26 crc kubenswrapper[4585]: I1201 14:03:26.245098 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 01 14:03:26 crc kubenswrapper[4585]: I1201 14:03:26.247308 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 14:03:26 crc kubenswrapper[4585]: I1201 14:03:26.247386 4585 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ea8eb7ae336552fdfc54bff029db229c87e34d7723ff20d387057752f0ba5113" exitCode=137 Dec 01 14:03:26 crc kubenswrapper[4585]: I1201 14:03:26.247431 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ea8eb7ae336552fdfc54bff029db229c87e34d7723ff20d387057752f0ba5113"} Dec 01 14:03:26 crc kubenswrapper[4585]: I1201 14:03:26.247476 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9f02dcbc2a7940583c19e38f95def824c2dfb5c19f7db253106c750880cca3e6"} Dec 01 14:03:26 crc kubenswrapper[4585]: I1201 14:03:26.247498 4585 scope.go:117] "RemoveContainer" containerID="d240ebf61216f20038983f7b8d2e0792fc0f1ed8db050debf02c777197d417b7" Dec 01 14:03:27 crc kubenswrapper[4585]: I1201 14:03:27.255929 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 01 14:03:27 crc kubenswrapper[4585]: I1201 14:03:27.564763 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 14:03:28 crc kubenswrapper[4585]: I1201 14:03:28.364064 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 14:03:29 crc kubenswrapper[4585]: I1201 14:03:29.681643 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 14:03:30 crc kubenswrapper[4585]: I1201 14:03:30.855383 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 14:03:31 crc kubenswrapper[4585]: I1201 14:03:31.206512 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 14:03:31 crc kubenswrapper[4585]: I1201 14:03:31.332403 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 14:03:35 crc kubenswrapper[4585]: I1201 14:03:35.538044 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 14:03:35 crc kubenswrapper[4585]: I1201 14:03:35.544179 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 14:03:36 crc kubenswrapper[4585]: I1201 14:03:36.315792 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 14:03:36 crc kubenswrapper[4585]: I1201 14:03:36.335403 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 14:03:38 crc kubenswrapper[4585]: I1201 14:03:38.723438 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 14:03:38 crc kubenswrapper[4585]: I1201 14:03:38.893577 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 14:03:39 crc kubenswrapper[4585]: I1201 14:03:39.332428 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 14:03:42 crc kubenswrapper[4585]: I1201 14:03:42.738892 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 14:03:43 crc kubenswrapper[4585]: I1201 14:03:43.525758 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 14:03:47 crc kubenswrapper[4585]: I1201 14:03:47.415702 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.345486 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n4pr6"] Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.346293 4585 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" podUID="e0b7b830-078c-4448-b914-ab62e5ff7059" containerName="controller-manager" containerID="cri-o://2cbe9f4750d1dc4f5fdacae9f66accbd7e008fac26e4d61d68b288134b1bf124" gracePeriod=30 Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.446577 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4"] Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.446846 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" podUID="7aedeec4-fc41-47aa-85a1-d5a92de50deb" containerName="route-controller-manager" containerID="cri-o://3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c" gracePeriod=30 Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.526278 4585 generic.go:334] "Generic (PLEG): container finished" podID="e0b7b830-078c-4448-b914-ab62e5ff7059" containerID="2cbe9f4750d1dc4f5fdacae9f66accbd7e008fac26e4d61d68b288134b1bf124" exitCode=0 Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.526522 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" event={"ID":"e0b7b830-078c-4448-b914-ab62e5ff7059","Type":"ContainerDied","Data":"2cbe9f4750d1dc4f5fdacae9f66accbd7e008fac26e4d61d68b288134b1bf124"} Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.770169 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.835943 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b7b830-078c-4448-b914-ab62e5ff7059-serving-cert\") pod \"e0b7b830-078c-4448-b914-ab62e5ff7059\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.836100 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-config\") pod \"e0b7b830-078c-4448-b914-ab62e5ff7059\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.836186 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-proxy-ca-bundles\") pod \"e0b7b830-078c-4448-b914-ab62e5ff7059\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.836263 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls596\" (UniqueName: \"kubernetes.io/projected/e0b7b830-078c-4448-b914-ab62e5ff7059-kube-api-access-ls596\") pod \"e0b7b830-078c-4448-b914-ab62e5ff7059\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.836288 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-client-ca\") pod \"e0b7b830-078c-4448-b914-ab62e5ff7059\" (UID: \"e0b7b830-078c-4448-b914-ab62e5ff7059\") " Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.837511 4585 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-client-ca" (OuterVolumeSpecName: "client-ca") pod "e0b7b830-078c-4448-b914-ab62e5ff7059" (UID: "e0b7b830-078c-4448-b914-ab62e5ff7059"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.838305 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e0b7b830-078c-4448-b914-ab62e5ff7059" (UID: "e0b7b830-078c-4448-b914-ab62e5ff7059"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.838904 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-config" (OuterVolumeSpecName: "config") pod "e0b7b830-078c-4448-b914-ab62e5ff7059" (UID: "e0b7b830-078c-4448-b914-ab62e5ff7059"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.850193 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b7b830-078c-4448-b914-ab62e5ff7059-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e0b7b830-078c-4448-b914-ab62e5ff7059" (UID: "e0b7b830-078c-4448-b914-ab62e5ff7059"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.850728 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b7b830-078c-4448-b914-ab62e5ff7059-kube-api-access-ls596" (OuterVolumeSpecName: "kube-api-access-ls596") pod "e0b7b830-078c-4448-b914-ab62e5ff7059" (UID: "e0b7b830-078c-4448-b914-ab62e5ff7059"). InnerVolumeSpecName "kube-api-access-ls596". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.854763 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.937615 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7aedeec4-fc41-47aa-85a1-d5a92de50deb-config\") pod \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.937683 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzd22\" (UniqueName: \"kubernetes.io/projected/7aedeec4-fc41-47aa-85a1-d5a92de50deb-kube-api-access-bzd22\") pod \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.937715 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7aedeec4-fc41-47aa-85a1-d5a92de50deb-serving-cert\") pod \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.937736 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7aedeec4-fc41-47aa-85a1-d5a92de50deb-client-ca\") pod \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\" (UID: \"7aedeec4-fc41-47aa-85a1-d5a92de50deb\") " Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.938142 4585 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.938166 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls596\" (UniqueName: \"kubernetes.io/projected/e0b7b830-078c-4448-b914-ab62e5ff7059-kube-api-access-ls596\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.938183 4585 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.938907 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0b7b830-078c-4448-b914-ab62e5ff7059-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.938928 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0b7b830-078c-4448-b914-ab62e5ff7059-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.938702 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aedeec4-fc41-47aa-85a1-d5a92de50deb-client-ca" (OuterVolumeSpecName: "client-ca") pod "7aedeec4-fc41-47aa-85a1-d5a92de50deb" (UID: "7aedeec4-fc41-47aa-85a1-d5a92de50deb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.938820 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aedeec4-fc41-47aa-85a1-d5a92de50deb-config" (OuterVolumeSpecName: "config") pod "7aedeec4-fc41-47aa-85a1-d5a92de50deb" (UID: "7aedeec4-fc41-47aa-85a1-d5a92de50deb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.941906 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aedeec4-fc41-47aa-85a1-d5a92de50deb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7aedeec4-fc41-47aa-85a1-d5a92de50deb" (UID: "7aedeec4-fc41-47aa-85a1-d5a92de50deb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:04:06 crc kubenswrapper[4585]: I1201 14:04:06.942884 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aedeec4-fc41-47aa-85a1-d5a92de50deb-kube-api-access-bzd22" (OuterVolumeSpecName: "kube-api-access-bzd22") pod "7aedeec4-fc41-47aa-85a1-d5a92de50deb" (UID: "7aedeec4-fc41-47aa-85a1-d5a92de50deb"). InnerVolumeSpecName "kube-api-access-bzd22". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.039408 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7aedeec4-fc41-47aa-85a1-d5a92de50deb-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.039858 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzd22\" (UniqueName: \"kubernetes.io/projected/7aedeec4-fc41-47aa-85a1-d5a92de50deb-kube-api-access-bzd22\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.039873 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7aedeec4-fc41-47aa-85a1-d5a92de50deb-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.039888 4585 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7aedeec4-fc41-47aa-85a1-d5a92de50deb-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.535892 4585 generic.go:334] "Generic (PLEG): container finished" podID="7aedeec4-fc41-47aa-85a1-d5a92de50deb" containerID="3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c" exitCode=0 Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.535997 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" event={"ID":"7aedeec4-fc41-47aa-85a1-d5a92de50deb","Type":"ContainerDied","Data":"3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c"} Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.536028 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.536057 4585 scope.go:117] "RemoveContainer" containerID="3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c" Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.536044 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4" event={"ID":"7aedeec4-fc41-47aa-85a1-d5a92de50deb","Type":"ContainerDied","Data":"399fa49b23bfa440b0d8a1ea9cb5a6d18bf49facf00a52b0589f86ceebf5ca49"} Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.538628 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" event={"ID":"e0b7b830-078c-4448-b914-ab62e5ff7059","Type":"ContainerDied","Data":"c271bb81a6ff7c8a174716750428e5b3af447fa7e355a3f9f0c65bd01d6fba6f"} Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.538715 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-n4pr6" Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.554216 4585 scope.go:117] "RemoveContainer" containerID="3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c" Dec 01 14:04:07 crc kubenswrapper[4585]: E1201 14:04:07.554993 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c\": container with ID starting with 3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c not found: ID does not exist" containerID="3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c" Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.555042 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c"} err="failed to get container status \"3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c\": rpc error: code = NotFound desc = could not find container \"3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c\": container with ID starting with 3f69e7acf4e89062c30ccd7ac049a19d74400767bd1fa888a301bc5a4e125c1c not found: ID does not exist" Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.555073 4585 scope.go:117] "RemoveContainer" containerID="2cbe9f4750d1dc4f5fdacae9f66accbd7e008fac26e4d61d68b288134b1bf124" Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.573507 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n4pr6"] Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.581104 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-n4pr6"] Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.583833 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4"] Dec 01 14:04:07 crc kubenswrapper[4585]: I1201 14:04:07.586493 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2vtl4"] Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.421292 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7aedeec4-fc41-47aa-85a1-d5a92de50deb" path="/var/lib/kubelet/pods/7aedeec4-fc41-47aa-85a1-d5a92de50deb/volumes" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.421851 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b7b830-078c-4448-b914-ab62e5ff7059" path="/var/lib/kubelet/pods/e0b7b830-078c-4448-b914-ab62e5ff7059/volumes" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.479875 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc"] Dec 01 14:04:08 crc kubenswrapper[4585]: E1201 14:04:08.480213 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.480237 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 14:04:08 crc kubenswrapper[4585]: E1201 14:04:08.480262 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b7b830-078c-4448-b914-ab62e5ff7059" containerName="controller-manager" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.480270 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b7b830-078c-4448-b914-ab62e5ff7059" containerName="controller-manager" Dec 01 14:04:08 crc kubenswrapper[4585]: E1201 14:04:08.480284 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aedeec4-fc41-47aa-85a1-d5a92de50deb" containerName="route-controller-manager" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.480291 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aedeec4-fc41-47aa-85a1-d5a92de50deb" containerName="route-controller-manager" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.480447 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b7b830-078c-4448-b914-ab62e5ff7059" containerName="controller-manager" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.480468 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aedeec4-fc41-47aa-85a1-d5a92de50deb" containerName="route-controller-manager" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.480480 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.481214 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.483873 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.484127 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n"] Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.484510 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.484800 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.485002 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.485144 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.485433 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.486520 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.491705 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.492304 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.492738 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.492945 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.493290 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.494256 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.495916 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.504379 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n"] Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.508028 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc"] Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.566758 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-serving-cert\") pod \"route-controller-manager-5d74b7c87d-s58hc\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.566817 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61cfcfa9-42be-4ff9-8147-c01d696e46ce-client-ca\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.566851 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61cfcfa9-42be-4ff9-8147-c01d696e46ce-proxy-ca-bundles\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.566874 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61cfcfa9-42be-4ff9-8147-c01d696e46ce-serving-cert\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.566899 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61cfcfa9-42be-4ff9-8147-c01d696e46ce-config\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.566919 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-config\") pod \"route-controller-manager-5d74b7c87d-s58hc\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.566934 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-client-ca\") pod \"route-controller-manager-5d74b7c87d-s58hc\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.567012 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn59z\" (UniqueName: \"kubernetes.io/projected/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-kube-api-access-rn59z\") pod \"route-controller-manager-5d74b7c87d-s58hc\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.567103 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4b4n\" (UniqueName: \"kubernetes.io/projected/61cfcfa9-42be-4ff9-8147-c01d696e46ce-kube-api-access-c4b4n\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.668035 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-serving-cert\") pod \"route-controller-manager-5d74b7c87d-s58hc\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.668111 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61cfcfa9-42be-4ff9-8147-c01d696e46ce-client-ca\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.668130 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61cfcfa9-42be-4ff9-8147-c01d696e46ce-proxy-ca-bundles\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.668151 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61cfcfa9-42be-4ff9-8147-c01d696e46ce-serving-cert\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.668177 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61cfcfa9-42be-4ff9-8147-c01d696e46ce-config\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.668194 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-config\") pod \"route-controller-manager-5d74b7c87d-s58hc\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.668211 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-client-ca\") pod \"route-controller-manager-5d74b7c87d-s58hc\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.668227 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn59z\" (UniqueName: \"kubernetes.io/projected/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-kube-api-access-rn59z\") pod \"route-controller-manager-5d74b7c87d-s58hc\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.668248 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4b4n\" (UniqueName: \"kubernetes.io/projected/61cfcfa9-42be-4ff9-8147-c01d696e46ce-kube-api-access-c4b4n\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.669782 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-client-ca\") pod \"route-controller-manager-5d74b7c87d-s58hc\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.669790 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/61cfcfa9-42be-4ff9-8147-c01d696e46ce-client-ca\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.670157 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61cfcfa9-42be-4ff9-8147-c01d696e46ce-config\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.670460 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/61cfcfa9-42be-4ff9-8147-c01d696e46ce-proxy-ca-bundles\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.670585 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-config\") pod \"route-controller-manager-5d74b7c87d-s58hc\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.674904 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61cfcfa9-42be-4ff9-8147-c01d696e46ce-serving-cert\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.674911 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-serving-cert\") pod \"route-controller-manager-5d74b7c87d-s58hc\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.687754 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4b4n\" (UniqueName: \"kubernetes.io/projected/61cfcfa9-42be-4ff9-8147-c01d696e46ce-kube-api-access-c4b4n\") pod \"controller-manager-6d47dc4c68-6vr9n\" (UID: \"61cfcfa9-42be-4ff9-8147-c01d696e46ce\") " pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.697186 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn59z\" (UniqueName: \"kubernetes.io/projected/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-kube-api-access-rn59z\") pod \"route-controller-manager-5d74b7c87d-s58hc\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " 
pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.806308 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:08 crc kubenswrapper[4585]: I1201 14:04:08.827895 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:09 crc kubenswrapper[4585]: I1201 14:04:09.015337 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc"] Dec 01 14:04:09 crc kubenswrapper[4585]: I1201 14:04:09.066114 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n"] Dec 01 14:04:09 crc kubenswrapper[4585]: W1201 14:04:09.079559 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61cfcfa9_42be_4ff9_8147_c01d696e46ce.slice/crio-5f8dea156db7b218f3d124d1e806dd889b2799f81016b8fb341bb850f7567b5d WatchSource:0}: Error finding container 5f8dea156db7b218f3d124d1e806dd889b2799f81016b8fb341bb850f7567b5d: Status 404 returned error can't find the container with id 5f8dea156db7b218f3d124d1e806dd889b2799f81016b8fb341bb850f7567b5d Dec 01 14:04:09 crc kubenswrapper[4585]: I1201 14:04:09.555393 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" event={"ID":"ee0a972a-5c20-4d54-a5dc-fab8121d2de0","Type":"ContainerStarted","Data":"3089ee4286a212bd7734a4a0c0a8bcdb897f70b060f4540f3d2fa97ce005f857"} Dec 01 14:04:09 crc kubenswrapper[4585]: I1201 14:04:09.555440 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" event={"ID":"ee0a972a-5c20-4d54-a5dc-fab8121d2de0","Type":"ContainerStarted","Data":"711bff127d843a95a3f4904e9efe41a4868c192f33998e1182409ff686f253ce"} Dec 01 14:04:09 crc kubenswrapper[4585]: I1201 14:04:09.557750 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:09 crc kubenswrapper[4585]: I1201 14:04:09.559842 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" event={"ID":"61cfcfa9-42be-4ff9-8147-c01d696e46ce","Type":"ContainerStarted","Data":"e31d326ee8aaaf5c38f11dccb399f54ee0fdf42e3ac2f3d4df46bbc4bbdfea82"} Dec 01 14:04:09 crc kubenswrapper[4585]: I1201 14:04:09.559891 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" event={"ID":"61cfcfa9-42be-4ff9-8147-c01d696e46ce","Type":"ContainerStarted","Data":"5f8dea156db7b218f3d124d1e806dd889b2799f81016b8fb341bb850f7567b5d"} Dec 01 14:04:09 crc kubenswrapper[4585]: I1201 14:04:09.560441 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:09 crc kubenswrapper[4585]: I1201 14:04:09.578929 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" Dec 01 14:04:09 crc kubenswrapper[4585]: I1201 14:04:09.605708 4585 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" podStartSLOduration=3.605684303 podStartE2EDuration="3.605684303s" podCreationTimestamp="2025-12-01 14:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:04:09.598596723 +0000 UTC m=+363.582810578" watchObservedRunningTime="2025-12-01 14:04:09.605684303 +0000 UTC m=+363.589898158" Dec 01 14:04:09 crc kubenswrapper[4585]: I1201 14:04:09.688107 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d47dc4c68-6vr9n" podStartSLOduration=3.6880858549999997 podStartE2EDuration="3.688085855s" podCreationTimestamp="2025-12-01 14:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:04:09.685847748 +0000 UTC m=+363.670061603" watchObservedRunningTime="2025-12-01 14:04:09.688085855 +0000 UTC m=+363.672299710" Dec 01 14:04:09 crc kubenswrapper[4585]: I1201 14:04:09.774632 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.292961 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mzg67"] Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.294093 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.296490 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.314371 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mzg67"] Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.394983 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92e1230-1b33-449a-9d96-204cdc4cc3ee-catalog-content\") pod \"certified-operators-mzg67\" (UID: \"d92e1230-1b33-449a-9d96-204cdc4cc3ee\") " pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.395054 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf7tl\" (UniqueName: \"kubernetes.io/projected/d92e1230-1b33-449a-9d96-204cdc4cc3ee-kube-api-access-rf7tl\") pod \"certified-operators-mzg67\" (UID: \"d92e1230-1b33-449a-9d96-204cdc4cc3ee\") " pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.395083 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92e1230-1b33-449a-9d96-204cdc4cc3ee-utilities\") pod \"certified-operators-mzg67\" (UID: \"d92e1230-1b33-449a-9d96-204cdc4cc3ee\") " pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.496121 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d92e1230-1b33-449a-9d96-204cdc4cc3ee-utilities\") pod \"certified-operators-mzg67\" (UID: \"d92e1230-1b33-449a-9d96-204cdc4cc3ee\") " pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.496244 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92e1230-1b33-449a-9d96-204cdc4cc3ee-catalog-content\") pod \"certified-operators-mzg67\" (UID: \"d92e1230-1b33-449a-9d96-204cdc4cc3ee\") " pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.496294 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf7tl\" (UniqueName: \"kubernetes.io/projected/d92e1230-1b33-449a-9d96-204cdc4cc3ee-kube-api-access-rf7tl\") pod \"certified-operators-mzg67\" (UID: \"d92e1230-1b33-449a-9d96-204cdc4cc3ee\") " pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.496827 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92e1230-1b33-449a-9d96-204cdc4cc3ee-utilities\") pod \"certified-operators-mzg67\" (UID: \"d92e1230-1b33-449a-9d96-204cdc4cc3ee\") " pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.496864 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92e1230-1b33-449a-9d96-204cdc4cc3ee-catalog-content\") pod \"certified-operators-mzg67\" (UID: \"d92e1230-1b33-449a-9d96-204cdc4cc3ee\") " pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.518547 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf7tl\" (UniqueName: \"kubernetes.io/projected/d92e1230-1b33-449a-9d96-204cdc4cc3ee-kube-api-access-rf7tl\") pod \"certified-operators-mzg67\" (UID: \"d92e1230-1b33-449a-9d96-204cdc4cc3ee\") " pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:10 crc kubenswrapper[4585]: I1201 14:04:10.612367 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:11 crc kubenswrapper[4585]: I1201 14:04:11.014182 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mzg67"] Dec 01 14:04:11 crc kubenswrapper[4585]: I1201 14:04:11.573184 4585 generic.go:334] "Generic (PLEG): container finished" podID="d92e1230-1b33-449a-9d96-204cdc4cc3ee" containerID="45336fd94d7b5453c7a05c29bc0ddb5642563eb517d40ef0230d1b819f0c3ccf" exitCode=0 Dec 01 14:04:11 crc kubenswrapper[4585]: I1201 14:04:11.573404 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzg67" event={"ID":"d92e1230-1b33-449a-9d96-204cdc4cc3ee","Type":"ContainerDied","Data":"45336fd94d7b5453c7a05c29bc0ddb5642563eb517d40ef0230d1b819f0c3ccf"} Dec 01 14:04:11 crc kubenswrapper[4585]: I1201 14:04:11.573785 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzg67" event={"ID":"d92e1230-1b33-449a-9d96-204cdc4cc3ee","Type":"ContainerStarted","Data":"feb76ae8c59da6b53673c1d755227fc1b2ddc1d6199313ba260e2363a9084c92"} Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.685709 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4mtgw"] Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.687046 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.689178 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.737642 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4mtgw"] Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.827189 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ac0d5a-4c26-4734-9386-f775f8dc5461-catalog-content\") pod \"redhat-operators-4mtgw\" (UID: \"d7ac0d5a-4c26-4734-9386-f775f8dc5461\") " pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.827512 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbs4t\" (UniqueName: \"kubernetes.io/projected/d7ac0d5a-4c26-4734-9386-f775f8dc5461-kube-api-access-rbs4t\") pod \"redhat-operators-4mtgw\" (UID: \"d7ac0d5a-4c26-4734-9386-f775f8dc5461\") " pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.827650 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ac0d5a-4c26-4734-9386-f775f8dc5461-utilities\") pod \"redhat-operators-4mtgw\" (UID: \"d7ac0d5a-4c26-4734-9386-f775f8dc5461\") " pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.929012 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ac0d5a-4c26-4734-9386-f775f8dc5461-catalog-content\") pod \"redhat-operators-4mtgw\" (UID: \"d7ac0d5a-4c26-4734-9386-f775f8dc5461\") " pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.929075 
4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbs4t\" (UniqueName: \"kubernetes.io/projected/d7ac0d5a-4c26-4734-9386-f775f8dc5461-kube-api-access-rbs4t\") pod \"redhat-operators-4mtgw\" (UID: \"d7ac0d5a-4c26-4734-9386-f775f8dc5461\") " pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.929111 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ac0d5a-4c26-4734-9386-f775f8dc5461-utilities\") pod \"redhat-operators-4mtgw\" (UID: \"d7ac0d5a-4c26-4734-9386-f775f8dc5461\") " pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.929715 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7ac0d5a-4c26-4734-9386-f775f8dc5461-catalog-content\") pod \"redhat-operators-4mtgw\" (UID: \"d7ac0d5a-4c26-4734-9386-f775f8dc5461\") " pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.929770 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7ac0d5a-4c26-4734-9386-f775f8dc5461-utilities\") pod \"redhat-operators-4mtgw\" (UID: \"d7ac0d5a-4c26-4734-9386-f775f8dc5461\") " pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:12 crc kubenswrapper[4585]: I1201 14:04:12.954051 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbs4t\" (UniqueName: \"kubernetes.io/projected/d7ac0d5a-4c26-4734-9386-f775f8dc5461-kube-api-access-rbs4t\") pod \"redhat-operators-4mtgw\" (UID: \"d7ac0d5a-4c26-4734-9386-f775f8dc5461\") " pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:13 crc kubenswrapper[4585]: I1201 14:04:13.001638 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:13 crc kubenswrapper[4585]: I1201 14:04:13.458575 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4mtgw"] Dec 01 14:04:13 crc kubenswrapper[4585]: I1201 14:04:13.585813 4585 generic.go:334] "Generic (PLEG): container finished" podID="d92e1230-1b33-449a-9d96-204cdc4cc3ee" containerID="3d7e2c320283c9a4093a052f44178580847808733b6cc1575b2e5afc8a41a71c" exitCode=0 Dec 01 14:04:13 crc kubenswrapper[4585]: I1201 14:04:13.585883 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzg67" event={"ID":"d92e1230-1b33-449a-9d96-204cdc4cc3ee","Type":"ContainerDied","Data":"3d7e2c320283c9a4093a052f44178580847808733b6cc1575b2e5afc8a41a71c"} Dec 01 14:04:13 crc kubenswrapper[4585]: I1201 14:04:13.589843 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mtgw" event={"ID":"d7ac0d5a-4c26-4734-9386-f775f8dc5461","Type":"ContainerStarted","Data":"8c4ea4a43781cf383d28f52890560ad5a934d1361792a4a986be6208ccd330ff"} Dec 01 14:04:13 crc kubenswrapper[4585]: I1201 14:04:13.716405 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:04:13 crc kubenswrapper[4585]: I1201 14:04:13.716504 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:04:14 crc kubenswrapper[4585]: I1201 14:04:14.600950 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mzg67" event={"ID":"d92e1230-1b33-449a-9d96-204cdc4cc3ee","Type":"ContainerStarted","Data":"056c0ca4eb740a7e98bbba90fca14dd50e3a35d17730090ae9d0449b1cd7f882"} Dec 01 14:04:14 crc kubenswrapper[4585]: I1201 14:04:14.604773 4585 generic.go:334] "Generic (PLEG): container finished" podID="d7ac0d5a-4c26-4734-9386-f775f8dc5461" containerID="c9f57f85aee0c06ad67f3ecb299db204d1d4d3c2f805e41832c3d184463fd26a" exitCode=0 Dec 01 14:04:14 crc kubenswrapper[4585]: I1201 14:04:14.604849 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mtgw" event={"ID":"d7ac0d5a-4c26-4734-9386-f775f8dc5461","Type":"ContainerDied","Data":"c9f57f85aee0c06ad67f3ecb299db204d1d4d3c2f805e41832c3d184463fd26a"} Dec 01 14:04:14 crc kubenswrapper[4585]: I1201 14:04:14.628462 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mzg67" podStartSLOduration=2.038538899 podStartE2EDuration="4.628443578s" podCreationTimestamp="2025-12-01 14:04:10 +0000 UTC" firstStartedPulling="2025-12-01 14:04:11.574870181 +0000 UTC m=+365.559084046" lastFinishedPulling="2025-12-01 14:04:14.16477487 +0000 UTC m=+368.148988725" observedRunningTime="2025-12-01 14:04:14.627147355 +0000 UTC m=+368.611361220" watchObservedRunningTime="2025-12-01 14:04:14.628443578 +0000 UTC m=+368.612657433" Dec 01 14:04:15 crc kubenswrapper[4585]: I1201 14:04:15.615791 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4mtgw" event={"ID":"d7ac0d5a-4c26-4734-9386-f775f8dc5461","Type":"ContainerStarted","Data":"67ac70dade0e9250c7c3074efa1db1eac56312f589507f5fa7b80778dd96252f"} Dec 01 14:04:16 crc kubenswrapper[4585]: I1201 14:04:16.621726 4585 generic.go:334] "Generic (PLEG): container finished" podID="d7ac0d5a-4c26-4734-9386-f775f8dc5461" containerID="67ac70dade0e9250c7c3074efa1db1eac56312f589507f5fa7b80778dd96252f" exitCode=0 Dec 01 14:04:16 crc kubenswrapper[4585]: I1201 14:04:16.621771 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mtgw" event={"ID":"d7ac0d5a-4c26-4734-9386-f775f8dc5461","Type":"ContainerDied","Data":"67ac70dade0e9250c7c3074efa1db1eac56312f589507f5fa7b80778dd96252f"} Dec 01 14:04:17 crc kubenswrapper[4585]: I1201 14:04:17.629918 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mtgw" event={"ID":"d7ac0d5a-4c26-4734-9386-f775f8dc5461","Type":"ContainerStarted","Data":"df7d2526494aa6f56f6f85028943202d689b3f8d58aec26166d4ef7ae13097e7"} Dec 01 14:04:17 crc kubenswrapper[4585]: I1201 14:04:17.652649 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4mtgw" podStartSLOduration=3.147792439 podStartE2EDuration="5.65263001s" podCreationTimestamp="2025-12-01 14:04:12 +0000 UTC" firstStartedPulling="2025-12-01 14:04:14.607359913 +0000 UTC m=+368.591573768" lastFinishedPulling="2025-12-01 14:04:17.112197484 +0000 UTC m=+371.096411339" observedRunningTime="2025-12-01 14:04:17.650058505 +0000 UTC m=+371.634272380" watchObservedRunningTime="2025-12-01 14:04:17.65263001 +0000 UTC m=+371.636843865" Dec 01 14:04:20 crc kubenswrapper[4585]: I1201 14:04:20.612420 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:20 crc kubenswrapper[4585]: I1201 14:04:20.612777 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:20 crc kubenswrapper[4585]: I1201 14:04:20.672031 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:20 crc kubenswrapper[4585]: I1201 14:04:20.716036 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mzg67" Dec 01 14:04:23 crc kubenswrapper[4585]: I1201 14:04:23.002728 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:23 crc kubenswrapper[4585]: I1201 14:04:23.003196 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:23 crc kubenswrapper[4585]: I1201 14:04:23.042837 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:23 crc kubenswrapper[4585]: I1201 14:04:23.706419 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4mtgw" Dec 01 14:04:26 crc kubenswrapper[4585]: I1201 14:04:26.399830 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc"] Dec 01 14:04:26 crc kubenswrapper[4585]: I1201 14:04:26.400624 4585 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" podUID="ee0a972a-5c20-4d54-a5dc-fab8121d2de0" containerName="route-controller-manager" containerID="cri-o://3089ee4286a212bd7734a4a0c0a8bcdb897f70b060f4540f3d2fa97ce005f857" gracePeriod=30 Dec 01 14:04:27 crc kubenswrapper[4585]: I1201 14:04:27.696764 4585 generic.go:334] "Generic (PLEG): container finished" podID="ee0a972a-5c20-4d54-a5dc-fab8121d2de0" containerID="3089ee4286a212bd7734a4a0c0a8bcdb897f70b060f4540f3d2fa97ce005f857" exitCode=0 Dec 01 14:04:27 crc kubenswrapper[4585]: I1201 14:04:27.696857 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" event={"ID":"ee0a972a-5c20-4d54-a5dc-fab8121d2de0","Type":"ContainerDied","Data":"3089ee4286a212bd7734a4a0c0a8bcdb897f70b060f4540f3d2fa97ce005f857"} Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.145882 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.163742 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn59z\" (UniqueName: \"kubernetes.io/projected/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-kube-api-access-rn59z\") pod \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.163880 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-config\") pod \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.164018 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-serving-cert\") pod \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.164093 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-client-ca\") pod \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\" (UID: \"ee0a972a-5c20-4d54-a5dc-fab8121d2de0\") " Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.165378 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-config" (OuterVolumeSpecName: "config") pod "ee0a972a-5c20-4d54-a5dc-fab8121d2de0" (UID: "ee0a972a-5c20-4d54-a5dc-fab8121d2de0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.165669 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-client-ca" (OuterVolumeSpecName: "client-ca") pod "ee0a972a-5c20-4d54-a5dc-fab8121d2de0" (UID: "ee0a972a-5c20-4d54-a5dc-fab8121d2de0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.166506 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.166554 4585 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.175373 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ee0a972a-5c20-4d54-a5dc-fab8121d2de0" (UID: "ee0a972a-5c20-4d54-a5dc-fab8121d2de0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.184316 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-kube-api-access-rn59z" (OuterVolumeSpecName: "kube-api-access-rn59z") pod "ee0a972a-5c20-4d54-a5dc-fab8121d2de0" (UID: "ee0a972a-5c20-4d54-a5dc-fab8121d2de0"). InnerVolumeSpecName "kube-api-access-rn59z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.192964 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq"] Dec 01 14:04:28 crc kubenswrapper[4585]: E1201 14:04:28.193322 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0a972a-5c20-4d54-a5dc-fab8121d2de0" containerName="route-controller-manager" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.193346 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0a972a-5c20-4d54-a5dc-fab8121d2de0" containerName="route-controller-manager" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.193487 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0a972a-5c20-4d54-a5dc-fab8121d2de0" containerName="route-controller-manager" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.194102 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.209633 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq"] Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.267806 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6decf28-4b98-4789-8ee9-2e16356f0ff5-config\") pod \"route-controller-manager-685565dc9d-j6lfq\" (UID: \"b6decf28-4b98-4789-8ee9-2e16356f0ff5\") " pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.267876 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6decf28-4b98-4789-8ee9-2e16356f0ff5-client-ca\") pod \"route-controller-manager-685565dc9d-j6lfq\" (UID: \"b6decf28-4b98-4789-8ee9-2e16356f0ff5\") " pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.268022 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpfbj\" (UniqueName: \"kubernetes.io/projected/b6decf28-4b98-4789-8ee9-2e16356f0ff5-kube-api-access-lpfbj\") pod \"route-controller-manager-685565dc9d-j6lfq\" (UID: \"b6decf28-4b98-4789-8ee9-2e16356f0ff5\") " pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.268155 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6decf28-4b98-4789-8ee9-2e16356f0ff5-serving-cert\") pod \"route-controller-manager-685565dc9d-j6lfq\" (UID: \"b6decf28-4b98-4789-8ee9-2e16356f0ff5\") " pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.268316 4585 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.268343 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn59z\" (UniqueName: \"kubernetes.io/projected/ee0a972a-5c20-4d54-a5dc-fab8121d2de0-kube-api-access-rn59z\") on node \"crc\" DevicePath \"\"" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.369901 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6decf28-4b98-4789-8ee9-2e16356f0ff5-config\") pod \"route-controller-manager-685565dc9d-j6lfq\" (UID: \"b6decf28-4b98-4789-8ee9-2e16356f0ff5\") " pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.369967 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6decf28-4b98-4789-8ee9-2e16356f0ff5-client-ca\") pod \"route-controller-manager-685565dc9d-j6lfq\" (UID: \"b6decf28-4b98-4789-8ee9-2e16356f0ff5\") " pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc 
kubenswrapper[4585]: I1201 14:04:28.370011 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpfbj\" (UniqueName: \"kubernetes.io/projected/b6decf28-4b98-4789-8ee9-2e16356f0ff5-kube-api-access-lpfbj\") pod \"route-controller-manager-685565dc9d-j6lfq\" (UID: \"b6decf28-4b98-4789-8ee9-2e16356f0ff5\") " pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.370037 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6decf28-4b98-4789-8ee9-2e16356f0ff5-serving-cert\") pod \"route-controller-manager-685565dc9d-j6lfq\" (UID: \"b6decf28-4b98-4789-8ee9-2e16356f0ff5\") " pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.371545 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6decf28-4b98-4789-8ee9-2e16356f0ff5-config\") pod \"route-controller-manager-685565dc9d-j6lfq\" (UID: \"b6decf28-4b98-4789-8ee9-2e16356f0ff5\") " pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.371874 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6decf28-4b98-4789-8ee9-2e16356f0ff5-client-ca\") pod \"route-controller-manager-685565dc9d-j6lfq\" (UID: \"b6decf28-4b98-4789-8ee9-2e16356f0ff5\") " pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.374592 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6decf28-4b98-4789-8ee9-2e16356f0ff5-serving-cert\") pod \"route-controller-manager-685565dc9d-j6lfq\" (UID: \"b6decf28-4b98-4789-8ee9-2e16356f0ff5\") " pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.389924 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpfbj\" (UniqueName: \"kubernetes.io/projected/b6decf28-4b98-4789-8ee9-2e16356f0ff5-kube-api-access-lpfbj\") pod \"route-controller-manager-685565dc9d-j6lfq\" (UID: \"b6decf28-4b98-4789-8ee9-2e16356f0ff5\") " pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.526424 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.718461 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" event={"ID":"ee0a972a-5c20-4d54-a5dc-fab8121d2de0","Type":"ContainerDied","Data":"711bff127d843a95a3f4904e9efe41a4868c192f33998e1182409ff686f253ce"} Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.718881 4585 scope.go:117] "RemoveContainer" containerID="3089ee4286a212bd7734a4a0c0a8bcdb897f70b060f4540f3d2fa97ce005f857" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.718564 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc" Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.743830 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc"] Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.748854 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d74b7c87d-s58hc"] Dec 01 14:04:28 crc kubenswrapper[4585]: I1201 14:04:28.982754 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq"] Dec 01 14:04:28 crc kubenswrapper[4585]: W1201 14:04:28.984650 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6decf28_4b98_4789_8ee9_2e16356f0ff5.slice/crio-f300497cfe4814c4c9840726f05925305e664689c4509baebf8b48d4591cd0e2 WatchSource:0}: Error finding container f300497cfe4814c4c9840726f05925305e664689c4509baebf8b48d4591cd0e2: Status 404 returned error can't find the container with id f300497cfe4814c4c9840726f05925305e664689c4509baebf8b48d4591cd0e2 Dec 01 14:04:29 crc kubenswrapper[4585]: I1201 14:04:29.725636 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" event={"ID":"b6decf28-4b98-4789-8ee9-2e16356f0ff5","Type":"ContainerStarted","Data":"77cfe00d98b439133d7be7c7f5baaa7ab363fec2cad4eaa2456a4cf6bdf34dd0"} Dec 01 14:04:29 crc kubenswrapper[4585]: I1201 14:04:29.725685 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" event={"ID":"b6decf28-4b98-4789-8ee9-2e16356f0ff5","Type":"ContainerStarted","Data":"f300497cfe4814c4c9840726f05925305e664689c4509baebf8b48d4591cd0e2"} Dec 01 14:04:29 crc kubenswrapper[4585]: I1201 14:04:29.725707 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:29 crc kubenswrapper[4585]: I1201 14:04:29.739465 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" Dec 01 14:04:29 crc kubenswrapper[4585]: I1201 14:04:29.747168 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-685565dc9d-j6lfq" podStartSLOduration=3.747139711 podStartE2EDuration="3.747139711s" podCreationTimestamp="2025-12-01 14:04:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:04:29.743736495 +0000 UTC m=+383.727950350" watchObservedRunningTime="2025-12-01 14:04:29.747139711 +0000 UTC m=+383.731353566" Dec 01 14:04:30 crc kubenswrapper[4585]: I1201 14:04:30.420924 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee0a972a-5c20-4d54-a5dc-fab8121d2de0" path="/var/lib/kubelet/pods/ee0a972a-5c20-4d54-a5dc-fab8121d2de0/volumes" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.372664 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-t4xwc"] Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.375870 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.405857 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-t4xwc"] Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.424299 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.424385 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kppjm\" (UniqueName: \"kubernetes.io/projected/e04e5bf9-f8fb-4361-a630-cd5f4483d289-kube-api-access-kppjm\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.424422 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e04e5bf9-f8fb-4361-a630-cd5f4483d289-ca-trust-extracted\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.424456 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e04e5bf9-f8fb-4361-a630-cd5f4483d289-registry-tls\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.424483 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e04e5bf9-f8fb-4361-a630-cd5f4483d289-trusted-ca\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.424512 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e04e5bf9-f8fb-4361-a630-cd5f4483d289-bound-sa-token\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.424560 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e04e5bf9-f8fb-4361-a630-cd5f4483d289-registry-certificates\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.424583 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/e04e5bf9-f8fb-4361-a630-cd5f4483d289-installation-pull-secrets\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.464432 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.526101 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kppjm\" (UniqueName: \"kubernetes.io/projected/e04e5bf9-f8fb-4361-a630-cd5f4483d289-kube-api-access-kppjm\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.526147 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e04e5bf9-f8fb-4361-a630-cd5f4483d289-ca-trust-extracted\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.526176 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e04e5bf9-f8fb-4361-a630-cd5f4483d289-registry-tls\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.526195 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e04e5bf9-f8fb-4361-a630-cd5f4483d289-trusted-ca\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.526214 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e04e5bf9-f8fb-4361-a630-cd5f4483d289-bound-sa-token\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.526248 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e04e5bf9-f8fb-4361-a630-cd5f4483d289-registry-certificates\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.526265 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e04e5bf9-f8fb-4361-a630-cd5f4483d289-installation-pull-secrets\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.527169 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e04e5bf9-f8fb-4361-a630-cd5f4483d289-ca-trust-extracted\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.528044 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e04e5bf9-f8fb-4361-a630-cd5f4483d289-registry-certificates\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.528200 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e04e5bf9-f8fb-4361-a630-cd5f4483d289-trusted-ca\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.534247 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e04e5bf9-f8fb-4361-a630-cd5f4483d289-installation-pull-secrets\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.538664 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e04e5bf9-f8fb-4361-a630-cd5f4483d289-registry-tls\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.545874 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e04e5bf9-f8fb-4361-a630-cd5f4483d289-bound-sa-token\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.548732 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kppjm\" (UniqueName: \"kubernetes.io/projected/e04e5bf9-f8fb-4361-a630-cd5f4483d289-kube-api-access-kppjm\") pod \"image-registry-66df7c8f76-t4xwc\" (UID: \"e04e5bf9-f8fb-4361-a630-cd5f4483d289\") " pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:38 crc kubenswrapper[4585]: I1201 14:04:38.694520 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:39 crc kubenswrapper[4585]: I1201 14:04:39.131284 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-t4xwc"] Dec 01 14:04:39 crc kubenswrapper[4585]: I1201 14:04:39.791380 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" event={"ID":"e04e5bf9-f8fb-4361-a630-cd5f4483d289","Type":"ContainerStarted","Data":"5508fc955a314b2cae5c1b0d027743ffc7c1b63d685f7729c72632263616e2a4"} Dec 01 14:04:39 crc kubenswrapper[4585]: I1201 14:04:39.791734 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:39 crc kubenswrapper[4585]: I1201 14:04:39.791748 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" event={"ID":"e04e5bf9-f8fb-4361-a630-cd5f4483d289","Type":"ContainerStarted","Data":"9ebde442f257457587ed5508c4ddf7864dc18559517e249ba140ef251fef632b"} Dec 01 14:04:39 crc kubenswrapper[4585]: I1201 14:04:39.815774 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" podStartSLOduration=1.815741636 podStartE2EDuration="1.815741636s" podCreationTimestamp="2025-12-01 14:04:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:04:39.814590781 +0000 UTC m=+393.798804646" watchObservedRunningTime="2025-12-01 14:04:39.815741636 +0000 UTC m=+393.799955501" Dec 01 14:04:43 crc kubenswrapper[4585]: I1201 14:04:43.716418 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:04:43 crc kubenswrapper[4585]: I1201 14:04:43.716880 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:04:58 crc kubenswrapper[4585]: I1201 14:04:58.702156 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-t4xwc" Dec 01 14:04:58 crc kubenswrapper[4585]: I1201 14:04:58.802936 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nrbjm"] Dec 01 14:05:13 crc kubenswrapper[4585]: I1201 14:05:13.716332 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:05:13 crc kubenswrapper[4585]: I1201 14:05:13.717439 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 01 14:05:13 crc kubenswrapper[4585]: I1201 14:05:13.717542 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:05:13 crc kubenswrapper[4585]: I1201 14:05:13.718632 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d257e580d36e30a7107591145ef0a5ff804617e4fa8a607c1d45bf357edd6a4"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:05:13 crc kubenswrapper[4585]: I1201 14:05:13.718743 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://8d257e580d36e30a7107591145ef0a5ff804617e4fa8a607c1d45bf357edd6a4" gracePeriod=600 Dec 01 14:05:14 crc kubenswrapper[4585]: I1201 14:05:14.033113 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="8d257e580d36e30a7107591145ef0a5ff804617e4fa8a607c1d45bf357edd6a4" exitCode=0 Dec 01 14:05:14 crc kubenswrapper[4585]: I1201 14:05:14.033199 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"8d257e580d36e30a7107591145ef0a5ff804617e4fa8a607c1d45bf357edd6a4"} Dec 01 14:05:14 crc kubenswrapper[4585]: I1201 14:05:14.033662 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"80328dc2704086bb4c5e275cec97e65017716d1273f5d086800acbe8844177d3"} Dec 01 14:05:14 crc kubenswrapper[4585]: I1201 14:05:14.033693 4585 scope.go:117] "RemoveContainer" containerID="4c2cf1c1e4965502ab56b6d9bbf7840d00225b831b1e613295befad0748d7d00" Dec 01 14:05:23 crc kubenswrapper[4585]: I1201 14:05:23.850956 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" podUID="6763aabd-f571-4b13-82fd-3a4a9bdf8406" containerName="registry" containerID="cri-o://bdadbd4ee03b3680b9bc0694394f0f784e6166981338a2c45679477dca2777cb" gracePeriod=30 Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.108588 4585 generic.go:334] "Generic (PLEG): container finished" podID="6763aabd-f571-4b13-82fd-3a4a9bdf8406" containerID="bdadbd4ee03b3680b9bc0694394f0f784e6166981338a2c45679477dca2777cb" exitCode=0 Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.109099 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" event={"ID":"6763aabd-f571-4b13-82fd-3a4a9bdf8406","Type":"ContainerDied","Data":"bdadbd4ee03b3680b9bc0694394f0f784e6166981338a2c45679477dca2777cb"} Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.353933 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.448762 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-bound-sa-token\") pod \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.448821 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwps2\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-kube-api-access-dwps2\") pod \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.448874 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6763aabd-f571-4b13-82fd-3a4a9bdf8406-trusted-ca\") pod \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.448928 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-registry-tls\") pod \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.449249 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.449317 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6763aabd-f571-4b13-82fd-3a4a9bdf8406-ca-trust-extracted\") pod \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.449367 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6763aabd-f571-4b13-82fd-3a4a9bdf8406-registry-certificates\") pod \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.449403 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6763aabd-f571-4b13-82fd-3a4a9bdf8406-installation-pull-secrets\") pod \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\" (UID: \"6763aabd-f571-4b13-82fd-3a4a9bdf8406\") " Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.452077 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6763aabd-f571-4b13-82fd-3a4a9bdf8406-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6763aabd-f571-4b13-82fd-3a4a9bdf8406" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.452888 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6763aabd-f571-4b13-82fd-3a4a9bdf8406-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6763aabd-f571-4b13-82fd-3a4a9bdf8406" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.458477 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6763aabd-f571-4b13-82fd-3a4a9bdf8406-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6763aabd-f571-4b13-82fd-3a4a9bdf8406" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.458718 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6763aabd-f571-4b13-82fd-3a4a9bdf8406" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.459578 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-kube-api-access-dwps2" (OuterVolumeSpecName: "kube-api-access-dwps2") pod "6763aabd-f571-4b13-82fd-3a4a9bdf8406" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406"). InnerVolumeSpecName "kube-api-access-dwps2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.462086 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6763aabd-f571-4b13-82fd-3a4a9bdf8406" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.469180 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "6763aabd-f571-4b13-82fd-3a4a9bdf8406" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.473647 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6763aabd-f571-4b13-82fd-3a4a9bdf8406-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6763aabd-f571-4b13-82fd-3a4a9bdf8406" (UID: "6763aabd-f571-4b13-82fd-3a4a9bdf8406"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.551016 4585 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.551058 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwps2\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-kube-api-access-dwps2\") on node \"crc\" DevicePath \"\"" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.551075 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6763aabd-f571-4b13-82fd-3a4a9bdf8406-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.551085 4585 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6763aabd-f571-4b13-82fd-3a4a9bdf8406-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.551095 4585 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6763aabd-f571-4b13-82fd-3a4a9bdf8406-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.551105 4585 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6763aabd-f571-4b13-82fd-3a4a9bdf8406-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 14:05:24 crc kubenswrapper[4585]: I1201 14:05:24.551115 4585 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6763aabd-f571-4b13-82fd-3a4a9bdf8406-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 14:05:25 crc kubenswrapper[4585]: I1201 14:05:25.120900 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" event={"ID":"6763aabd-f571-4b13-82fd-3a4a9bdf8406","Type":"ContainerDied","Data":"3e1e827a43f74634644da947859b40ccd0b30f29fb64bd562cb1a345965e8529"} Dec 01 14:05:25 crc kubenswrapper[4585]: I1201 14:05:25.121022 4585 scope.go:117] "RemoveContainer" containerID="bdadbd4ee03b3680b9bc0694394f0f784e6166981338a2c45679477dca2777cb" Dec 01 14:05:25 crc kubenswrapper[4585]: I1201 14:05:25.121075 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nrbjm" Dec 01 14:05:25 crc kubenswrapper[4585]: I1201 14:05:25.187918 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nrbjm"] Dec 01 14:05:25 crc kubenswrapper[4585]: I1201 14:05:25.194878 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nrbjm"] Dec 01 14:05:26 crc kubenswrapper[4585]: I1201 14:05:26.429708 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6763aabd-f571-4b13-82fd-3a4a9bdf8406" path="/var/lib/kubelet/pods/6763aabd-f571-4b13-82fd-3a4a9bdf8406/volumes" Dec 01 14:07:06 crc kubenswrapper[4585]: I1201 14:07:06.678341 4585 scope.go:117] "RemoveContainer" containerID="62d79488af697b9eb7d206b75aafc7e5cbe6d7e2b205e09fbc90a009d90474cc" Dec 01 14:07:13 crc kubenswrapper[4585]: I1201 14:07:13.716329 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:07:13 crc kubenswrapper[4585]: I1201 14:07:13.716915 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.806359 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-mk428"] Dec 01 14:07:35 crc kubenswrapper[4585]: E1201 14:07:35.807165 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6763aabd-f571-4b13-82fd-3a4a9bdf8406" containerName="registry" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.807179 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="6763aabd-f571-4b13-82fd-3a4a9bdf8406" containerName="registry" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.807295 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="6763aabd-f571-4b13-82fd-3a4a9bdf8406" containerName="registry" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.807661 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-mk428" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.809544 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-kfgj7"] Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.810065 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-kfgj7" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.812538 4585 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-mpq22" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.819242 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.820020 4585 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-j2sng" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.821894 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.854665 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-kfgj7"] Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.876960 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2ldxr"] Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.877884 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2ldxr" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.882413 4585 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-brc5m" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.882712 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-mk428"] Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.895130 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2ldxr"] Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.943818 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2rw2\" (UniqueName: \"kubernetes.io/projected/45ed0e5e-d1d0-45c5-9710-bcc051a7956e-kube-api-access-p2rw2\") pod \"cert-manager-5b446d88c5-mk428\" (UID: \"45ed0e5e-d1d0-45c5-9710-bcc051a7956e\") " pod="cert-manager/cert-manager-5b446d88c5-mk428" Dec 01 14:07:35 crc kubenswrapper[4585]: I1201 14:07:35.943881 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzbg5\" (UniqueName: \"kubernetes.io/projected/03ae09b7-07fe-4a7b-9012-c17019e6d0fa-kube-api-access-mzbg5\") pod \"cert-manager-cainjector-7f985d654d-kfgj7\" (UID: \"03ae09b7-07fe-4a7b-9012-c17019e6d0fa\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-kfgj7" Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.045076 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2rw2\" (UniqueName: \"kubernetes.io/projected/45ed0e5e-d1d0-45c5-9710-bcc051a7956e-kube-api-access-p2rw2\") pod \"cert-manager-5b446d88c5-mk428\" (UID: \"45ed0e5e-d1d0-45c5-9710-bcc051a7956e\") " pod="cert-manager/cert-manager-5b446d88c5-mk428" Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.045436 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzbg5\" (UniqueName: \"kubernetes.io/projected/03ae09b7-07fe-4a7b-9012-c17019e6d0fa-kube-api-access-mzbg5\") pod \"cert-manager-cainjector-7f985d654d-kfgj7\" (UID: \"03ae09b7-07fe-4a7b-9012-c17019e6d0fa\") " 
pod="cert-manager/cert-manager-cainjector-7f985d654d-kfgj7" Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.045653 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqwnh\" (UniqueName: \"kubernetes.io/projected/f1560ff4-292f-425d-8b6f-d481c951c541-kube-api-access-mqwnh\") pod \"cert-manager-webhook-5655c58dd6-2ldxr\" (UID: \"f1560ff4-292f-425d-8b6f-d481c951c541\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2ldxr" Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.071995 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2rw2\" (UniqueName: \"kubernetes.io/projected/45ed0e5e-d1d0-45c5-9710-bcc051a7956e-kube-api-access-p2rw2\") pod \"cert-manager-5b446d88c5-mk428\" (UID: \"45ed0e5e-d1d0-45c5-9710-bcc051a7956e\") " pod="cert-manager/cert-manager-5b446d88c5-mk428" Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.073412 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzbg5\" (UniqueName: \"kubernetes.io/projected/03ae09b7-07fe-4a7b-9012-c17019e6d0fa-kube-api-access-mzbg5\") pod \"cert-manager-cainjector-7f985d654d-kfgj7\" (UID: \"03ae09b7-07fe-4a7b-9012-c17019e6d0fa\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-kfgj7" Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.126253 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-mk428" Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.132987 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-kfgj7" Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.146941 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqwnh\" (UniqueName: \"kubernetes.io/projected/f1560ff4-292f-425d-8b6f-d481c951c541-kube-api-access-mqwnh\") pod \"cert-manager-webhook-5655c58dd6-2ldxr\" (UID: \"f1560ff4-292f-425d-8b6f-d481c951c541\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2ldxr" Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.188162 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqwnh\" (UniqueName: \"kubernetes.io/projected/f1560ff4-292f-425d-8b6f-d481c951c541-kube-api-access-mqwnh\") pod \"cert-manager-webhook-5655c58dd6-2ldxr\" (UID: \"f1560ff4-292f-425d-8b6f-d481c951c541\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2ldxr" Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.194359 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2ldxr" Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.447860 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-kfgj7"] Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.470445 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.611944 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2ldxr"] Dec 01 14:07:36 crc kubenswrapper[4585]: I1201 14:07:36.626719 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-mk428"] Dec 01 14:07:36 crc kubenswrapper[4585]: W1201 14:07:36.628884 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1560ff4_292f_425d_8b6f_d481c951c541.slice/crio-46eecd191a2a9cd5adda6ca5b21ed18335a88023f9a80ee35540ec3e9843cf36 WatchSource:0}: Error finding container 46eecd191a2a9cd5adda6ca5b21ed18335a88023f9a80ee35540ec3e9843cf36: Status 404 returned error can't find the container with id 46eecd191a2a9cd5adda6ca5b21ed18335a88023f9a80ee35540ec3e9843cf36 Dec 01 14:07:36 crc kubenswrapper[4585]: W1201 14:07:36.642765 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45ed0e5e_d1d0_45c5_9710_bcc051a7956e.slice/crio-7cc897cfa3a70977f21232b0bdbb8c13dec873dbde3174b88d2916dcc9aff5d6 WatchSource:0}: Error finding container 7cc897cfa3a70977f21232b0bdbb8c13dec873dbde3174b88d2916dcc9aff5d6: Status 404 returned error can't find the container with id 7cc897cfa3a70977f21232b0bdbb8c13dec873dbde3174b88d2916dcc9aff5d6 Dec 01 14:07:37 crc kubenswrapper[4585]: I1201 14:07:37.060804 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2ldxr" event={"ID":"f1560ff4-292f-425d-8b6f-d481c951c541","Type":"ContainerStarted","Data":"46eecd191a2a9cd5adda6ca5b21ed18335a88023f9a80ee35540ec3e9843cf36"} Dec 01 14:07:37 crc kubenswrapper[4585]: I1201 14:07:37.062139 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-mk428" event={"ID":"45ed0e5e-d1d0-45c5-9710-bcc051a7956e","Type":"ContainerStarted","Data":"7cc897cfa3a70977f21232b0bdbb8c13dec873dbde3174b88d2916dcc9aff5d6"} Dec 01 14:07:37 crc kubenswrapper[4585]: I1201 14:07:37.063258 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-kfgj7" event={"ID":"03ae09b7-07fe-4a7b-9012-c17019e6d0fa","Type":"ContainerStarted","Data":"042e38b8e57243c3afddbaff1d3357faeb9e155a5a819375c170a1da4529393f"} Dec 01 14:07:41 crc kubenswrapper[4585]: I1201 14:07:41.113633 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-mk428" event={"ID":"45ed0e5e-d1d0-45c5-9710-bcc051a7956e","Type":"ContainerStarted","Data":"5f11a6823d296d8b1be26b42ebb7d1dd8bb4989a2da5c6a743902c95e785aa21"} Dec 01 14:07:41 crc kubenswrapper[4585]: I1201 14:07:41.115634 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-kfgj7" event={"ID":"03ae09b7-07fe-4a7b-9012-c17019e6d0fa","Type":"ContainerStarted","Data":"d835a5c41d053d7c5c014d75bd24a8cb2201ec692f4c5db47e0669066453f53d"} Dec 01 14:07:41 crc kubenswrapper[4585]: I1201 14:07:41.117179 4585 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2ldxr" event={"ID":"f1560ff4-292f-425d-8b6f-d481c951c541","Type":"ContainerStarted","Data":"423bd1c897ab1dccff318917874125db4e4af0db811657826851bddd9b518f2f"} Dec 01 14:07:41 crc kubenswrapper[4585]: I1201 14:07:41.117372 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-2ldxr" Dec 01 14:07:41 crc kubenswrapper[4585]: I1201 14:07:41.153155 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-mk428" podStartSLOduration=2.828191923 podStartE2EDuration="6.153130068s" podCreationTimestamp="2025-12-01 14:07:35 +0000 UTC" firstStartedPulling="2025-12-01 14:07:36.656331264 +0000 UTC m=+570.640545119" lastFinishedPulling="2025-12-01 14:07:39.981269409 +0000 UTC m=+573.965483264" observedRunningTime="2025-12-01 14:07:41.132556978 +0000 UTC m=+575.116770833" watchObservedRunningTime="2025-12-01 14:07:41.153130068 +0000 UTC m=+575.137343923" Dec 01 14:07:41 crc kubenswrapper[4585]: I1201 14:07:41.173332 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-2ldxr" podStartSLOduration=2.768377553 podStartE2EDuration="6.173313008s" podCreationTimestamp="2025-12-01 14:07:35 +0000 UTC" firstStartedPulling="2025-12-01 14:07:36.640902961 +0000 UTC m=+570.625116806" lastFinishedPulling="2025-12-01 14:07:40.045838406 +0000 UTC m=+574.030052261" observedRunningTime="2025-12-01 14:07:41.155728538 +0000 UTC m=+575.139942403" watchObservedRunningTime="2025-12-01 14:07:41.173313008 +0000 UTC m=+575.157526863" Dec 01 14:07:43 crc kubenswrapper[4585]: I1201 14:07:43.716960 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:07:43 crc kubenswrapper[4585]: I1201 14:07:43.717163 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.043940 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-kfgj7" podStartSLOduration=7.594595522 podStartE2EDuration="11.043914364s" podCreationTimestamp="2025-12-01 14:07:35 +0000 UTC" firstStartedPulling="2025-12-01 14:07:36.470167774 +0000 UTC m=+570.454381629" lastFinishedPulling="2025-12-01 14:07:39.919486616 +0000 UTC m=+573.903700471" observedRunningTime="2025-12-01 14:07:41.177400037 +0000 UTC m=+575.161613892" watchObservedRunningTime="2025-12-01 14:07:46.043914364 +0000 UTC m=+580.028128219" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.047146 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tjkqr"] Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.047607 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovn-controller" 
containerID="cri-o://fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609" gracePeriod=30 Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.047752 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="kube-rbac-proxy-node" containerID="cri-o://90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3" gracePeriod=30 Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.047711 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="northd" containerID="cri-o://1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee" gracePeriod=30 Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.047816 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4" gracePeriod=30 Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.047843 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="nbdb" containerID="cri-o://ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3" gracePeriod=30 Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.047770 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovn-acl-logging" containerID="cri-o://0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e" gracePeriod=30 Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.047929 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="sbdb" containerID="cri-o://5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a" gracePeriod=30 Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.137506 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" containerID="cri-o://3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90" gracePeriod=30 Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.202537 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-2ldxr" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.497992 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/3.log" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.500524 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovn-acl-logging/0.log" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.501084 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovn-controller/0.log" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.501517 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.565609 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sblgv"] Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.565929 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovn-acl-logging" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.565952 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovn-acl-logging" Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.565988 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="nbdb" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.565998 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="nbdb" Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.566012 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566022 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.566032 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="sbdb" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566039 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="sbdb" Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.566048 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovn-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566055 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovn-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.566063 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566070 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.566080 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566087 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.566098 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566107 4585 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.566118 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566125 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.566134 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566141 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.566155 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="northd" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566163 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="northd" Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.566172 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="kube-rbac-proxy-node" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566181 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="kube-rbac-proxy-node" Dec 01 14:07:46 crc kubenswrapper[4585]: E1201 14:07:46.566196 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="kubecfg-setup" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566204 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="kubecfg-setup" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566341 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="kube-rbac-proxy-node" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566358 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566367 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="kube-rbac-proxy-ovn-metrics" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566379 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovn-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566388 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="northd" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566398 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="nbdb" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566407 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovn-acl-logging" Dec 01 14:07:46 crc kubenswrapper[4585]: 
I1201 14:07:46.566419 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566427 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566437 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="sbdb" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566447 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.566703 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerName="ovnkube-controller" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.569192 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.640729 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-etc-openvswitch\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.640796 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-systemd\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.640828 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovnkube-config\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.640850 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-log-socket\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.640881 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjhfw\" (UniqueName: \"kubernetes.io/projected/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-kube-api-access-xjhfw\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.640901 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-openvswitch\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.640898 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") 
pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.640941 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-env-overrides\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.640955 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-var-lib-openvswitch\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641008 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-run-ovn-kubernetes\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641023 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-cni-bin\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641057 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovnkube-script-lib\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641072 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-log-socket" (OuterVolumeSpecName: "log-socket") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641133 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-kubelet\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641165 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-ovn\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641162 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641193 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-run-netns\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641196 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641215 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-cni-netd\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641240 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-slash\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641275 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-systemd-units\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641297 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641331 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovn-node-metrics-cert\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641356 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-node-log\") pod \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\" (UID: \"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d\") " Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641612 4585 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641627 4585 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-log-socket\") on node \"crc\" DevicePath \"\"" Dec 01 
14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641635 4585 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641645 4585 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641635 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641684 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-node-log" (OuterVolumeSpecName: "node-log") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641685 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641718 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641725 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-slash" (OuterVolumeSpecName: "host-slash") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641719 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641753 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). 
InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641764 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641755 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641778 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641790 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641811 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.641961 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.647719 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.647767 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-kube-api-access-xjhfw" (OuterVolumeSpecName: "kube-api-access-xjhfw") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "kube-api-access-xjhfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.656438 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" (UID: "b0b45150-070d-4f7c-b53a-d76dcbaa6e6d"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.743018 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7afad35-2240-4b10-8595-c7badfe07544-ovnkube-config\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.743083 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-cni-bin\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.743114 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-run-systemd\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.743671 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-var-lib-openvswitch\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.743722 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-etc-openvswitch\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.743887 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-run-ovn\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.743963 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-systemd-units\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744009 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7afad35-2240-4b10-8595-c7badfe07544-ovn-node-metrics-cert\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744074 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cppcb\" (UniqueName: \"kubernetes.io/projected/d7afad35-2240-4b10-8595-c7badfe07544-kube-api-access-cppcb\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744108 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-log-socket\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744126 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-cni-netd\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744206 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-slash\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744234 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d7afad35-2240-4b10-8595-c7badfe07544-env-overrides\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744279 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d7afad35-2240-4b10-8595-c7badfe07544-ovnkube-script-lib\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744298 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-run-netns\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744355 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-kubelet\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744438 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-run-ovn-kubernetes\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744471 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-node-log\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744500 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-run-openvswitch\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744537 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744660 4585 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-slash\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744680 4585 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744693 4585 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744703 4585 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744721 4585 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-node-log\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744739 4585 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744753 4585 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744770 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjhfw\" (UniqueName: \"kubernetes.io/projected/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-kube-api-access-xjhfw\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744781 4585 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744795 4585 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744805 4585 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744815 4585 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744825 4585 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744835 4585 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744843 4585 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.744852 4585 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845189 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-run-systemd\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845244 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-var-lib-openvswitch\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845270 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-etc-openvswitch\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845287 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-run-ovn\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845306 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-systemd-units\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845325 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7afad35-2240-4b10-8595-c7badfe07544-ovn-node-metrics-cert\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845345 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cppcb\" (UniqueName: \"kubernetes.io/projected/d7afad35-2240-4b10-8595-c7badfe07544-kube-api-access-cppcb\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845366 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-log-socket\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845359 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-var-lib-openvswitch\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845382 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-cni-netd\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845385 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-run-ovn\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845427 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-slash\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845419 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-run-systemd\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845477 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-systemd-units\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845444 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-log-socket\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845402 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-slash\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845455 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-cni-netd\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845359 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-etc-openvswitch\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845596 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d7afad35-2240-4b10-8595-c7badfe07544-env-overrides\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845686 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-run-netns\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845729 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/d7afad35-2240-4b10-8595-c7badfe07544-ovnkube-script-lib\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845761 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-kubelet\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845786 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-run-netns\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845798 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-run-ovn-kubernetes\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845829 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-node-log\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845861 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-run-openvswitch\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845903 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.845953 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7afad35-2240-4b10-8595-c7badfe07544-ovnkube-config\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.846022 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-cni-bin\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.846109 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-cni-bin\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.846147 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-kubelet\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.846176 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-run-ovn-kubernetes\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.846194 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-run-openvswitch\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.846205 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.846219 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d7afad35-2240-4b10-8595-c7badfe07544-node-log\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.846434 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d7afad35-2240-4b10-8595-c7badfe07544-env-overrides\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.846639 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d7afad35-2240-4b10-8595-c7badfe07544-ovnkube-script-lib\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.847057 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7afad35-2240-4b10-8595-c7badfe07544-ovnkube-config\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.851174 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7afad35-2240-4b10-8595-c7badfe07544-ovn-node-metrics-cert\") pod 
\"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.864322 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cppcb\" (UniqueName: \"kubernetes.io/projected/d7afad35-2240-4b10-8595-c7badfe07544-kube-api-access-cppcb\") pod \"ovnkube-node-sblgv\" (UID: \"d7afad35-2240-4b10-8595-c7badfe07544\") " pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:46 crc kubenswrapper[4585]: I1201 14:07:46.888397 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.156492 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9wjs5_6e7ad3ad-7937-409b-b1c9-9c801f937400/kube-multus/2.log" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.157927 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9wjs5_6e7ad3ad-7937-409b-b1c9-9c801f937400/kube-multus/1.log" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.157994 4585 generic.go:334] "Generic (PLEG): container finished" podID="6e7ad3ad-7937-409b-b1c9-9c801f937400" containerID="faa193efeada9b36721dd685be49ca406d22ffe8d1ba80f075c5d12bec3e3baf" exitCode=2 Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.158046 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9wjs5" event={"ID":"6e7ad3ad-7937-409b-b1c9-9c801f937400","Type":"ContainerDied","Data":"faa193efeada9b36721dd685be49ca406d22ffe8d1ba80f075c5d12bec3e3baf"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.158106 4585 scope.go:117] "RemoveContainer" containerID="babb191a7dbe739f4bf7c2ae917b82b700f2987751c60a05ef3d9b3d11195953" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.159480 4585 scope.go:117] "RemoveContainer" containerID="faa193efeada9b36721dd685be49ca406d22ffe8d1ba80f075c5d12bec3e3baf" Dec 01 14:07:47 crc kubenswrapper[4585]: E1201 14:07:47.159893 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9wjs5_openshift-multus(6e7ad3ad-7937-409b-b1c9-9c801f937400)\"" pod="openshift-multus/multus-9wjs5" podUID="6e7ad3ad-7937-409b-b1c9-9c801f937400" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.162934 4585 generic.go:334] "Generic (PLEG): container finished" podID="d7afad35-2240-4b10-8595-c7badfe07544" containerID="aba3ef59d9ad41f32e031cf963fed5afadbfd142c51966e49b9a1807d32f7368" exitCode=0 Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.163127 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" event={"ID":"d7afad35-2240-4b10-8595-c7badfe07544","Type":"ContainerDied","Data":"aba3ef59d9ad41f32e031cf963fed5afadbfd142c51966e49b9a1807d32f7368"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.163197 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" event={"ID":"d7afad35-2240-4b10-8595-c7badfe07544","Type":"ContainerStarted","Data":"37b97302079bbef9f572e811b17990c46a747f444372ad25a4ccede604ebede8"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.168243 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovnkube-controller/3.log" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.175743 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovn-acl-logging/0.log" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.184251 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tjkqr_b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/ovn-controller/0.log" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189416 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90" exitCode=0 Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189463 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a" exitCode=0 Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189491 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3" exitCode=0 Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189500 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee" exitCode=0 Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189507 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4" exitCode=0 Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189514 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3" exitCode=0 Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189520 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e" exitCode=143 Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189528 4585 generic.go:334] "Generic (PLEG): container finished" podID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" containerID="fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609" exitCode=143 Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189570 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189600 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189611 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" 
event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189622 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189653 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189665 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189668 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.189676 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191103 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191120 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191128 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191135 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191143 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191251 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191264 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191272 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609"} Dec 01 14:07:47 
crc kubenswrapper[4585]: I1201 14:07:47.191279 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191293 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191313 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191322 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191328 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191334 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191341 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191346 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191352 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191358 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191363 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191370 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191378 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191388 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191394 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191399 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191405 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191410 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191415 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191421 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191427 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191433 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191439 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191447 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tjkqr" event={"ID":"b0b45150-070d-4f7c-b53a-d76dcbaa6e6d","Type":"ContainerDied","Data":"8009062339cfd4914b028580ac141efa8f7917ae1acd3198f12de89af62f34b4"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191456 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191463 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191470 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191476 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191482 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191488 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191494 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191500 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191506 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.191519 4585 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58"} Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.196254 4585 scope.go:117] "RemoveContainer" containerID="3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.266601 4585 scope.go:117] "RemoveContainer" containerID="6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.303235 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tjkqr"] Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.308558 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tjkqr"] Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.311911 4585 scope.go:117] "RemoveContainer" containerID="5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.335213 4585 scope.go:117] "RemoveContainer" containerID="ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.349502 4585 scope.go:117] "RemoveContainer" containerID="1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.365020 4585 scope.go:117] "RemoveContainer" containerID="37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.388459 4585 scope.go:117] "RemoveContainer" containerID="90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.467685 4585 scope.go:117] "RemoveContainer" containerID="0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.493394 4585 scope.go:117] "RemoveContainer" containerID="fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609" Dec 01 
14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.516474 4585 scope.go:117] "RemoveContainer" containerID="3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.533464 4585 scope.go:117] "RemoveContainer" containerID="3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90" Dec 01 14:07:47 crc kubenswrapper[4585]: E1201 14:07:47.534103 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90\": container with ID starting with 3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90 not found: ID does not exist" containerID="3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.534134 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90"} err="failed to get container status \"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90\": rpc error: code = NotFound desc = could not find container \"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90\": container with ID starting with 3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.534162 4585 scope.go:117] "RemoveContainer" containerID="6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd" Dec 01 14:07:47 crc kubenswrapper[4585]: E1201 14:07:47.534422 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\": container with ID starting with 6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd not found: ID does not exist" containerID="6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.534444 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd"} err="failed to get container status \"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\": rpc error: code = NotFound desc = could not find container \"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\": container with ID starting with 6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.534457 4585 scope.go:117] "RemoveContainer" containerID="5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a" Dec 01 14:07:47 crc kubenswrapper[4585]: E1201 14:07:47.534704 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\": container with ID starting with 5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a not found: ID does not exist" containerID="5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.534719 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a"} err="failed to get container 
status \"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\": rpc error: code = NotFound desc = could not find container \"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\": container with ID starting with 5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.534732 4585 scope.go:117] "RemoveContainer" containerID="ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3" Dec 01 14:07:47 crc kubenswrapper[4585]: E1201 14:07:47.534950 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\": container with ID starting with ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3 not found: ID does not exist" containerID="ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.534965 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3"} err="failed to get container status \"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\": rpc error: code = NotFound desc = could not find container \"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\": container with ID starting with ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.534994 4585 scope.go:117] "RemoveContainer" containerID="1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee" Dec 01 14:07:47 crc kubenswrapper[4585]: E1201 14:07:47.535277 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\": container with ID starting with 1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee not found: ID does not exist" containerID="1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.535298 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee"} err="failed to get container status \"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\": rpc error: code = NotFound desc = could not find container \"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\": container with ID starting with 1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.535313 4585 scope.go:117] "RemoveContainer" containerID="37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4" Dec 01 14:07:47 crc kubenswrapper[4585]: E1201 14:07:47.535811 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\": container with ID starting with 37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4 not found: ID does not exist" containerID="37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.535829 4585 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4"} err="failed to get container status \"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\": rpc error: code = NotFound desc = could not find container \"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\": container with ID starting with 37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.535841 4585 scope.go:117] "RemoveContainer" containerID="90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3" Dec 01 14:07:47 crc kubenswrapper[4585]: E1201 14:07:47.536119 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\": container with ID starting with 90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3 not found: ID does not exist" containerID="90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.536139 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3"} err="failed to get container status \"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\": rpc error: code = NotFound desc = could not find container \"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\": container with ID starting with 90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.536151 4585 scope.go:117] "RemoveContainer" containerID="0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e" Dec 01 14:07:47 crc kubenswrapper[4585]: E1201 14:07:47.536400 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\": container with ID starting with 0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e not found: ID does not exist" containerID="0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.536416 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e"} err="failed to get container status \"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\": rpc error: code = NotFound desc = could not find container \"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\": container with ID starting with 0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.536428 4585 scope.go:117] "RemoveContainer" containerID="fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609" Dec 01 14:07:47 crc kubenswrapper[4585]: E1201 14:07:47.536931 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\": container with ID starting with fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609 not found: ID does not exist" 
containerID="fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.536947 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609"} err="failed to get container status \"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\": rpc error: code = NotFound desc = could not find container \"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\": container with ID starting with fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.536960 4585 scope.go:117] "RemoveContainer" containerID="3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58" Dec 01 14:07:47 crc kubenswrapper[4585]: E1201 14:07:47.539964 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\": container with ID starting with 3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58 not found: ID does not exist" containerID="3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.540018 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58"} err="failed to get container status \"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\": rpc error: code = NotFound desc = could not find container \"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\": container with ID starting with 3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.540039 4585 scope.go:117] "RemoveContainer" containerID="3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.540335 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90"} err="failed to get container status \"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90\": rpc error: code = NotFound desc = could not find container \"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90\": container with ID starting with 3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.540357 4585 scope.go:117] "RemoveContainer" containerID="6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.541017 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd"} err="failed to get container status \"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\": rpc error: code = NotFound desc = could not find container \"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\": container with ID starting with 6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.541038 4585 scope.go:117] "RemoveContainer" 
containerID="5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.541374 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a"} err="failed to get container status \"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\": rpc error: code = NotFound desc = could not find container \"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\": container with ID starting with 5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.541396 4585 scope.go:117] "RemoveContainer" containerID="ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.541584 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3"} err="failed to get container status \"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\": rpc error: code = NotFound desc = could not find container \"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\": container with ID starting with ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.541599 4585 scope.go:117] "RemoveContainer" containerID="1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.541892 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee"} err="failed to get container status \"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\": rpc error: code = NotFound desc = could not find container \"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\": container with ID starting with 1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.541966 4585 scope.go:117] "RemoveContainer" containerID="37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.542381 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4"} err="failed to get container status \"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\": rpc error: code = NotFound desc = could not find container \"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\": container with ID starting with 37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.542401 4585 scope.go:117] "RemoveContainer" containerID="90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.542758 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3"} err="failed to get container status \"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\": rpc error: code = NotFound desc = could not find 
container \"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\": container with ID starting with 90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.542778 4585 scope.go:117] "RemoveContainer" containerID="0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.542990 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e"} err="failed to get container status \"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\": rpc error: code = NotFound desc = could not find container \"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\": container with ID starting with 0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.543007 4585 scope.go:117] "RemoveContainer" containerID="fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.543253 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609"} err="failed to get container status \"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\": rpc error: code = NotFound desc = could not find container \"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\": container with ID starting with fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.543278 4585 scope.go:117] "RemoveContainer" containerID="3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.543596 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58"} err="failed to get container status \"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\": rpc error: code = NotFound desc = could not find container \"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\": container with ID starting with 3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.543612 4585 scope.go:117] "RemoveContainer" containerID="3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.543817 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90"} err="failed to get container status \"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90\": rpc error: code = NotFound desc = could not find container \"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90\": container with ID starting with 3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.543837 4585 scope.go:117] "RemoveContainer" containerID="6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.544042 4585 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd"} err="failed to get container status \"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\": rpc error: code = NotFound desc = could not find container \"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\": container with ID starting with 6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.544065 4585 scope.go:117] "RemoveContainer" containerID="5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.544268 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a"} err="failed to get container status \"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\": rpc error: code = NotFound desc = could not find container \"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\": container with ID starting with 5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.544286 4585 scope.go:117] "RemoveContainer" containerID="ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.544495 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3"} err="failed to get container status \"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\": rpc error: code = NotFound desc = could not find container \"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\": container with ID starting with ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.544513 4585 scope.go:117] "RemoveContainer" containerID="1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.544674 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee"} err="failed to get container status \"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\": rpc error: code = NotFound desc = could not find container \"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\": container with ID starting with 1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.544691 4585 scope.go:117] "RemoveContainer" containerID="37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.544845 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4"} err="failed to get container status \"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\": rpc error: code = NotFound desc = could not find container \"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\": container with ID starting with 
37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.544862 4585 scope.go:117] "RemoveContainer" containerID="90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.545089 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3"} err="failed to get container status \"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\": rpc error: code = NotFound desc = could not find container \"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\": container with ID starting with 90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.545121 4585 scope.go:117] "RemoveContainer" containerID="0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.545309 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e"} err="failed to get container status \"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\": rpc error: code = NotFound desc = could not find container \"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\": container with ID starting with 0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.545328 4585 scope.go:117] "RemoveContainer" containerID="fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.545576 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609"} err="failed to get container status \"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\": rpc error: code = NotFound desc = could not find container \"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\": container with ID starting with fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.545602 4585 scope.go:117] "RemoveContainer" containerID="3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.545818 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58"} err="failed to get container status \"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\": rpc error: code = NotFound desc = could not find container \"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\": container with ID starting with 3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.545841 4585 scope.go:117] "RemoveContainer" containerID="3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.546072 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90"} err="failed to get container status \"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90\": rpc error: code = NotFound desc = could not find container \"3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90\": container with ID starting with 3b2013660155359d058ac5bb7ff50988013ccaf6dd753976bfddfdfc11f9ec90 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.546091 4585 scope.go:117] "RemoveContainer" containerID="6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.546307 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd"} err="failed to get container status \"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\": rpc error: code = NotFound desc = could not find container \"6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd\": container with ID starting with 6fdc8fb68081a35662e2944a74d74746db096b944121e6261161172793f792fd not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.546325 4585 scope.go:117] "RemoveContainer" containerID="5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.546482 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a"} err="failed to get container status \"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\": rpc error: code = NotFound desc = could not find container \"5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a\": container with ID starting with 5c85de7cd35b11158d931065cec6bafaf456be350f4a6095b8a9d0851bf87b1a not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.546501 4585 scope.go:117] "RemoveContainer" containerID="ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.546715 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3"} err="failed to get container status \"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\": rpc error: code = NotFound desc = could not find container \"ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3\": container with ID starting with ec93c685227d533ac1ab980dca6f72792f746e68511a7a546fb25189b02507d3 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.546736 4585 scope.go:117] "RemoveContainer" containerID="1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.546995 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee"} err="failed to get container status \"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\": rpc error: code = NotFound desc = could not find container \"1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee\": container with ID starting with 1b64c1a0d36f8addf6040455b1e933756582857ef8098c935b9cc5a115e8afee not found: ID does not exist" Dec 
01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.547014 4585 scope.go:117] "RemoveContainer" containerID="37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.547181 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4"} err="failed to get container status \"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\": rpc error: code = NotFound desc = could not find container \"37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4\": container with ID starting with 37437844a9c9719627f96c0afa303f0d449c46def7249b8c69ebd44900661fa4 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.547198 4585 scope.go:117] "RemoveContainer" containerID="90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.547345 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3"} err="failed to get container status \"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\": rpc error: code = NotFound desc = could not find container \"90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3\": container with ID starting with 90163e3776ee8b2a338e3f8e6aa3d918d1d6123d246fbbd88659d342348167a3 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.547362 4585 scope.go:117] "RemoveContainer" containerID="0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.547657 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e"} err="failed to get container status \"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\": rpc error: code = NotFound desc = could not find container \"0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e\": container with ID starting with 0d740a1be02b59a062ea8d5c44620cb57725a7ec494fd365e44a8980947c0d2e not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.547676 4585 scope.go:117] "RemoveContainer" containerID="fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.547832 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609"} err="failed to get container status \"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\": rpc error: code = NotFound desc = could not find container \"fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609\": container with ID starting with fdbccd3910729d23067d94c9ed2f5f3c917cd4880185a3b7e8c4c811cd30e609 not found: ID does not exist" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.547849 4585 scope.go:117] "RemoveContainer" containerID="3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58" Dec 01 14:07:47 crc kubenswrapper[4585]: I1201 14:07:47.548022 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58"} err="failed to get container status 
\"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\": rpc error: code = NotFound desc = could not find container \"3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58\": container with ID starting with 3ec28d0e19161204ce1aeedacc5c4a4cfa771a903d959818415f0670d8e45e58 not found: ID does not exist" Dec 01 14:07:48 crc kubenswrapper[4585]: I1201 14:07:48.204036 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9wjs5_6e7ad3ad-7937-409b-b1c9-9c801f937400/kube-multus/2.log" Dec 01 14:07:48 crc kubenswrapper[4585]: I1201 14:07:48.211019 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" event={"ID":"d7afad35-2240-4b10-8595-c7badfe07544","Type":"ContainerStarted","Data":"d0f816a0a188eca310d6677fb1be7d9dc5686d10131bf4ed411cf929303d4f9d"} Dec 01 14:07:48 crc kubenswrapper[4585]: I1201 14:07:48.211083 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" event={"ID":"d7afad35-2240-4b10-8595-c7badfe07544","Type":"ContainerStarted","Data":"468dc6b6bc312cddbaaca7a4de92b982ff2c24f0a8f68b4a9b1a3676f25eb31b"} Dec 01 14:07:48 crc kubenswrapper[4585]: I1201 14:07:48.211108 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" event={"ID":"d7afad35-2240-4b10-8595-c7badfe07544","Type":"ContainerStarted","Data":"6a2287018f8eec8ecb61c953c93c24e0e36bd1255d418e5cf609d97d4fb86c54"} Dec 01 14:07:48 crc kubenswrapper[4585]: I1201 14:07:48.211128 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" event={"ID":"d7afad35-2240-4b10-8595-c7badfe07544","Type":"ContainerStarted","Data":"462f19596d6656f38789f7301a7de69f4fbbd389f2b1433fe49d84f23f0d8cc5"} Dec 01 14:07:48 crc kubenswrapper[4585]: I1201 14:07:48.211148 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" event={"ID":"d7afad35-2240-4b10-8595-c7badfe07544","Type":"ContainerStarted","Data":"b885891786121d75c3f2517f6d817ad7523d5aba7826ba7874c0512a74a7fdc7"} Dec 01 14:07:48 crc kubenswrapper[4585]: I1201 14:07:48.211169 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" event={"ID":"d7afad35-2240-4b10-8595-c7badfe07544","Type":"ContainerStarted","Data":"5e0e489abdf8d7e7b5299267644e8db0b57e583f210982f1e71ba6e4d0ebb95a"} Dec 01 14:07:48 crc kubenswrapper[4585]: I1201 14:07:48.424100 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0b45150-070d-4f7c-b53a-d76dcbaa6e6d" path="/var/lib/kubelet/pods/b0b45150-070d-4f7c-b53a-d76dcbaa6e6d/volumes" Dec 01 14:07:50 crc kubenswrapper[4585]: I1201 14:07:50.239217 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" event={"ID":"d7afad35-2240-4b10-8595-c7badfe07544","Type":"ContainerStarted","Data":"f5759b83db1f2bedd79cacb4c5f3b1cb003191dea6e723d64dcdd673ea5de4fd"} Dec 01 14:07:53 crc kubenswrapper[4585]: I1201 14:07:53.262555 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" event={"ID":"d7afad35-2240-4b10-8595-c7badfe07544","Type":"ContainerStarted","Data":"2382ab90f7dcf45968fed9fdefd35929014666e4d55f418157c11eb1581bc33b"} Dec 01 14:07:53 crc kubenswrapper[4585]: I1201 14:07:53.263366 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:53 crc 
kubenswrapper[4585]: I1201 14:07:53.263387 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:53 crc kubenswrapper[4585]: I1201 14:07:53.291471 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:53 crc kubenswrapper[4585]: I1201 14:07:53.298847 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" podStartSLOduration=7.298821329 podStartE2EDuration="7.298821329s" podCreationTimestamp="2025-12-01 14:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:07:53.294965496 +0000 UTC m=+587.279179361" watchObservedRunningTime="2025-12-01 14:07:53.298821329 +0000 UTC m=+587.283035184" Dec 01 14:07:54 crc kubenswrapper[4585]: I1201 14:07:54.267664 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:07:54 crc kubenswrapper[4585]: I1201 14:07:54.313859 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:08:02 crc kubenswrapper[4585]: I1201 14:08:02.413598 4585 scope.go:117] "RemoveContainer" containerID="faa193efeada9b36721dd685be49ca406d22ffe8d1ba80f075c5d12bec3e3baf" Dec 01 14:08:02 crc kubenswrapper[4585]: E1201 14:08:02.414181 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9wjs5_openshift-multus(6e7ad3ad-7937-409b-b1c9-9c801f937400)\"" pod="openshift-multus/multus-9wjs5" podUID="6e7ad3ad-7937-409b-b1c9-9c801f937400" Dec 01 14:08:06 crc kubenswrapper[4585]: I1201 14:08:06.734865 4585 scope.go:117] "RemoveContainer" containerID="5ffd1f9a2b8eaf70cf7ece20438894648d313b347b383cc6d739d279439a2c36" Dec 01 14:08:06 crc kubenswrapper[4585]: I1201 14:08:06.756897 4585 scope.go:117] "RemoveContainer" containerID="34943a310d82b3ede86ddc5c7f53cb9f2334ca164cc1c199a110fff8577a6e1c" Dec 01 14:08:13 crc kubenswrapper[4585]: I1201 14:08:13.716028 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:08:13 crc kubenswrapper[4585]: I1201 14:08:13.716920 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:08:13 crc kubenswrapper[4585]: I1201 14:08:13.717006 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:08:13 crc kubenswrapper[4585]: I1201 14:08:13.717807 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80328dc2704086bb4c5e275cec97e65017716d1273f5d086800acbe8844177d3"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:08:13 crc kubenswrapper[4585]: I1201 14:08:13.717884 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://80328dc2704086bb4c5e275cec97e65017716d1273f5d086800acbe8844177d3" gracePeriod=600 Dec 01 14:08:14 crc kubenswrapper[4585]: I1201 14:08:14.388556 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="80328dc2704086bb4c5e275cec97e65017716d1273f5d086800acbe8844177d3" exitCode=0 Dec 01 14:08:14 crc kubenswrapper[4585]: I1201 14:08:14.388631 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"80328dc2704086bb4c5e275cec97e65017716d1273f5d086800acbe8844177d3"} Dec 01 14:08:14 crc kubenswrapper[4585]: I1201 14:08:14.389147 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"9c565360e1e1b852f24cf87ad3ed2b80ca20fd43a45c1f1f0ee3553f5b1d6b02"} Dec 01 14:08:14 crc kubenswrapper[4585]: I1201 14:08:14.389205 4585 scope.go:117] "RemoveContainer" containerID="8d257e580d36e30a7107591145ef0a5ff804617e4fa8a607c1d45bf357edd6a4" Dec 01 14:08:16 crc kubenswrapper[4585]: I1201 14:08:16.913361 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sblgv" Dec 01 14:08:17 crc kubenswrapper[4585]: I1201 14:08:17.412828 4585 scope.go:117] "RemoveContainer" containerID="faa193efeada9b36721dd685be49ca406d22ffe8d1ba80f075c5d12bec3e3baf" Dec 01 14:08:18 crc kubenswrapper[4585]: I1201 14:08:18.419369 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9wjs5_6e7ad3ad-7937-409b-b1c9-9c801f937400/kube-multus/2.log" Dec 01 14:08:18 crc kubenswrapper[4585]: I1201 14:08:18.421268 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9wjs5" event={"ID":"6e7ad3ad-7937-409b-b1c9-9c801f937400","Type":"ContainerStarted","Data":"f4d73e000fe0c691db77d87460ea633ab945891fc434d2fd1aff422c533532d6"} Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.605253 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7"] Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.608301 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.613955 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.617688 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7"] Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.756916 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37acd505-ab0f-4779-844d-3dbe65a936c0-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7\" (UID: \"37acd505-ab0f-4779-844d-3dbe65a936c0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.757250 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs4q8\" (UniqueName: \"kubernetes.io/projected/37acd505-ab0f-4779-844d-3dbe65a936c0-kube-api-access-rs4q8\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7\" (UID: \"37acd505-ab0f-4779-844d-3dbe65a936c0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.757357 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37acd505-ab0f-4779-844d-3dbe65a936c0-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7\" (UID: \"37acd505-ab0f-4779-844d-3dbe65a936c0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.858620 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37acd505-ab0f-4779-844d-3dbe65a936c0-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7\" (UID: \"37acd505-ab0f-4779-844d-3dbe65a936c0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.859022 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs4q8\" (UniqueName: \"kubernetes.io/projected/37acd505-ab0f-4779-844d-3dbe65a936c0-kube-api-access-rs4q8\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7\" (UID: \"37acd505-ab0f-4779-844d-3dbe65a936c0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.859134 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37acd505-ab0f-4779-844d-3dbe65a936c0-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7\" (UID: \"37acd505-ab0f-4779-844d-3dbe65a936c0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.859362 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/37acd505-ab0f-4779-844d-3dbe65a936c0-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7\" (UID: \"37acd505-ab0f-4779-844d-3dbe65a936c0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.859662 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37acd505-ab0f-4779-844d-3dbe65a936c0-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7\" (UID: \"37acd505-ab0f-4779-844d-3dbe65a936c0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.888684 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs4q8\" (UniqueName: \"kubernetes.io/projected/37acd505-ab0f-4779-844d-3dbe65a936c0-kube-api-access-rs4q8\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7\" (UID: \"37acd505-ab0f-4779-844d-3dbe65a936c0\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:27 crc kubenswrapper[4585]: I1201 14:08:27.963391 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:28 crc kubenswrapper[4585]: I1201 14:08:28.396404 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7"] Dec 01 14:08:29 crc kubenswrapper[4585]: I1201 14:08:29.013315 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" event={"ID":"37acd505-ab0f-4779-844d-3dbe65a936c0","Type":"ContainerStarted","Data":"1f7e9475bc9b2ca842a6d96453dfbdf10dddf0e1872278c61891726b0f577a00"} Dec 01 14:08:30 crc kubenswrapper[4585]: I1201 14:08:30.020626 4585 generic.go:334] "Generic (PLEG): container finished" podID="37acd505-ab0f-4779-844d-3dbe65a936c0" containerID="cfd5a84cdc704165d40fa7ea15f6bb41f3467dc88f83497ca2dbeff63790a431" exitCode=0 Dec 01 14:08:30 crc kubenswrapper[4585]: I1201 14:08:30.021171 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" event={"ID":"37acd505-ab0f-4779-844d-3dbe65a936c0","Type":"ContainerDied","Data":"cfd5a84cdc704165d40fa7ea15f6bb41f3467dc88f83497ca2dbeff63790a431"} Dec 01 14:08:32 crc kubenswrapper[4585]: I1201 14:08:32.035231 4585 generic.go:334] "Generic (PLEG): container finished" podID="37acd505-ab0f-4779-844d-3dbe65a936c0" containerID="56f9632d9e3c1901f48dff23440de0cd9749ea79aa5a55941c9e37abfc9b6c23" exitCode=0 Dec 01 14:08:32 crc kubenswrapper[4585]: I1201 14:08:32.035289 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" event={"ID":"37acd505-ab0f-4779-844d-3dbe65a936c0","Type":"ContainerDied","Data":"56f9632d9e3c1901f48dff23440de0cd9749ea79aa5a55941c9e37abfc9b6c23"} Dec 01 14:08:33 crc kubenswrapper[4585]: I1201 14:08:33.056471 4585 generic.go:334] "Generic (PLEG): container finished" podID="37acd505-ab0f-4779-844d-3dbe65a936c0" containerID="a532d25123476ca4cff69727295f67b6c4ed34748d82fefee4b796f19ea519d9" exitCode=0 Dec 01 14:08:33 crc kubenswrapper[4585]: I1201 
14:08:33.056550 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" event={"ID":"37acd505-ab0f-4779-844d-3dbe65a936c0","Type":"ContainerDied","Data":"a532d25123476ca4cff69727295f67b6c4ed34748d82fefee4b796f19ea519d9"} Dec 01 14:08:34 crc kubenswrapper[4585]: I1201 14:08:34.339338 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:34 crc kubenswrapper[4585]: I1201 14:08:34.457323 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs4q8\" (UniqueName: \"kubernetes.io/projected/37acd505-ab0f-4779-844d-3dbe65a936c0-kube-api-access-rs4q8\") pod \"37acd505-ab0f-4779-844d-3dbe65a936c0\" (UID: \"37acd505-ab0f-4779-844d-3dbe65a936c0\") " Dec 01 14:08:34 crc kubenswrapper[4585]: I1201 14:08:34.457450 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37acd505-ab0f-4779-844d-3dbe65a936c0-bundle\") pod \"37acd505-ab0f-4779-844d-3dbe65a936c0\" (UID: \"37acd505-ab0f-4779-844d-3dbe65a936c0\") " Dec 01 14:08:34 crc kubenswrapper[4585]: I1201 14:08:34.457531 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37acd505-ab0f-4779-844d-3dbe65a936c0-util\") pod \"37acd505-ab0f-4779-844d-3dbe65a936c0\" (UID: \"37acd505-ab0f-4779-844d-3dbe65a936c0\") " Dec 01 14:08:34 crc kubenswrapper[4585]: I1201 14:08:34.459606 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37acd505-ab0f-4779-844d-3dbe65a936c0-bundle" (OuterVolumeSpecName: "bundle") pod "37acd505-ab0f-4779-844d-3dbe65a936c0" (UID: "37acd505-ab0f-4779-844d-3dbe65a936c0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:08:34 crc kubenswrapper[4585]: I1201 14:08:34.465232 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37acd505-ab0f-4779-844d-3dbe65a936c0-kube-api-access-rs4q8" (OuterVolumeSpecName: "kube-api-access-rs4q8") pod "37acd505-ab0f-4779-844d-3dbe65a936c0" (UID: "37acd505-ab0f-4779-844d-3dbe65a936c0"). InnerVolumeSpecName "kube-api-access-rs4q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:08:34 crc kubenswrapper[4585]: I1201 14:08:34.478770 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37acd505-ab0f-4779-844d-3dbe65a936c0-util" (OuterVolumeSpecName: "util") pod "37acd505-ab0f-4779-844d-3dbe65a936c0" (UID: "37acd505-ab0f-4779-844d-3dbe65a936c0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:08:34 crc kubenswrapper[4585]: I1201 14:08:34.558467 4585 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37acd505-ab0f-4779-844d-3dbe65a936c0-util\") on node \"crc\" DevicePath \"\"" Dec 01 14:08:34 crc kubenswrapper[4585]: I1201 14:08:34.558498 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs4q8\" (UniqueName: \"kubernetes.io/projected/37acd505-ab0f-4779-844d-3dbe65a936c0-kube-api-access-rs4q8\") on node \"crc\" DevicePath \"\"" Dec 01 14:08:34 crc kubenswrapper[4585]: I1201 14:08:34.558510 4585 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37acd505-ab0f-4779-844d-3dbe65a936c0-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:08:35 crc kubenswrapper[4585]: I1201 14:08:35.085359 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" event={"ID":"37acd505-ab0f-4779-844d-3dbe65a936c0","Type":"ContainerDied","Data":"1f7e9475bc9b2ca842a6d96453dfbdf10dddf0e1872278c61891726b0f577a00"} Dec 01 14:08:35 crc kubenswrapper[4585]: I1201 14:08:35.085406 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f7e9475bc9b2ca842a6d96453dfbdf10dddf0e1872278c61891726b0f577a00" Dec 01 14:08:35 crc kubenswrapper[4585]: I1201 14:08:35.085499 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7" Dec 01 14:08:38 crc kubenswrapper[4585]: I1201 14:08:38.982330 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-c4cpl"] Dec 01 14:08:38 crc kubenswrapper[4585]: E1201 14:08:38.983093 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37acd505-ab0f-4779-844d-3dbe65a936c0" containerName="extract" Dec 01 14:08:38 crc kubenswrapper[4585]: I1201 14:08:38.983110 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="37acd505-ab0f-4779-844d-3dbe65a936c0" containerName="extract" Dec 01 14:08:38 crc kubenswrapper[4585]: E1201 14:08:38.983126 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37acd505-ab0f-4779-844d-3dbe65a936c0" containerName="pull" Dec 01 14:08:38 crc kubenswrapper[4585]: I1201 14:08:38.983147 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="37acd505-ab0f-4779-844d-3dbe65a936c0" containerName="pull" Dec 01 14:08:38 crc kubenswrapper[4585]: E1201 14:08:38.983168 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37acd505-ab0f-4779-844d-3dbe65a936c0" containerName="util" Dec 01 14:08:38 crc kubenswrapper[4585]: I1201 14:08:38.983176 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="37acd505-ab0f-4779-844d-3dbe65a936c0" containerName="util" Dec 01 14:08:38 crc kubenswrapper[4585]: I1201 14:08:38.983314 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="37acd505-ab0f-4779-844d-3dbe65a936c0" containerName="extract" Dec 01 14:08:38 crc kubenswrapper[4585]: I1201 14:08:38.983908 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-c4cpl" Dec 01 14:08:38 crc kubenswrapper[4585]: I1201 14:08:38.986664 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 01 14:08:38 crc kubenswrapper[4585]: I1201 14:08:38.986867 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-pf5kt" Dec 01 14:08:38 crc kubenswrapper[4585]: I1201 14:08:38.987679 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 01 14:08:39 crc kubenswrapper[4585]: I1201 14:08:39.050333 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-c4cpl"] Dec 01 14:08:39 crc kubenswrapper[4585]: I1201 14:08:39.130793 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpmlx\" (UniqueName: \"kubernetes.io/projected/dfe8ef28-9d20-49ce-8084-bfdfbc024e0c-kube-api-access-fpmlx\") pod \"nmstate-operator-5b5b58f5c8-c4cpl\" (UID: \"dfe8ef28-9d20-49ce-8084-bfdfbc024e0c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-c4cpl" Dec 01 14:08:39 crc kubenswrapper[4585]: I1201 14:08:39.232064 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpmlx\" (UniqueName: \"kubernetes.io/projected/dfe8ef28-9d20-49ce-8084-bfdfbc024e0c-kube-api-access-fpmlx\") pod \"nmstate-operator-5b5b58f5c8-c4cpl\" (UID: \"dfe8ef28-9d20-49ce-8084-bfdfbc024e0c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-c4cpl" Dec 01 14:08:39 crc kubenswrapper[4585]: I1201 14:08:39.255007 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpmlx\" (UniqueName: \"kubernetes.io/projected/dfe8ef28-9d20-49ce-8084-bfdfbc024e0c-kube-api-access-fpmlx\") pod \"nmstate-operator-5b5b58f5c8-c4cpl\" (UID: \"dfe8ef28-9d20-49ce-8084-bfdfbc024e0c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-c4cpl" Dec 01 14:08:39 crc kubenswrapper[4585]: I1201 14:08:39.306156 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-c4cpl" Dec 01 14:08:39 crc kubenswrapper[4585]: I1201 14:08:39.610217 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-c4cpl"] Dec 01 14:08:40 crc kubenswrapper[4585]: I1201 14:08:40.134415 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-c4cpl" event={"ID":"dfe8ef28-9d20-49ce-8084-bfdfbc024e0c","Type":"ContainerStarted","Data":"f8593afaa58342a7f4bcc0c3c64ec34c283b27cbad1f25187ca96af8cdbfe332"} Dec 01 14:08:42 crc kubenswrapper[4585]: I1201 14:08:42.149871 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-c4cpl" event={"ID":"dfe8ef28-9d20-49ce-8084-bfdfbc024e0c","Type":"ContainerStarted","Data":"329747d449fd91c181668422761812cc21b662ac159c7aa4f96b2dba6afb3333"} Dec 01 14:08:42 crc kubenswrapper[4585]: I1201 14:08:42.171879 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-c4cpl" podStartSLOduration=1.906344295 podStartE2EDuration="4.17185333s" podCreationTimestamp="2025-12-01 14:08:38 +0000 UTC" firstStartedPulling="2025-12-01 14:08:39.615455163 +0000 UTC m=+633.599669028" lastFinishedPulling="2025-12-01 14:08:41.880964208 +0000 UTC m=+635.865178063" observedRunningTime="2025-12-01 14:08:42.167484193 +0000 UTC m=+636.151698068" watchObservedRunningTime="2025-12-01 14:08:42.17185333 +0000 UTC m=+636.156067185" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.780515 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-w84ck"] Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.782777 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-w84ck" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.785134 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-cbvd8" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.795889 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-w84ck"] Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.811493 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb"] Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.812252 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.824571 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.849460 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb"] Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.854131 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jj8n7"] Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.855043 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.881825 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxp6f\" (UniqueName: \"kubernetes.io/projected/d03ec6db-a14f-40ee-80b7-2232ffc0a321-kube-api-access-zxp6f\") pod \"nmstate-metrics-7f946cbc9-w84ck\" (UID: \"d03ec6db-a14f-40ee-80b7-2232ffc0a321\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-w84ck" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.881887 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xvt8\" (UniqueName: \"kubernetes.io/projected/6c8cbbf5-3146-44bc-8533-17523cd27750-kube-api-access-6xvt8\") pod \"nmstate-handler-jj8n7\" (UID: \"6c8cbbf5-3146-44bc-8533-17523cd27750\") " pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.881947 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a4a15dc7-9cbc-4c39-b9ec-f73877001cd7-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-rvcwb\" (UID: \"a4a15dc7-9cbc-4c39-b9ec-f73877001cd7\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.882009 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48bzx\" (UniqueName: \"kubernetes.io/projected/a4a15dc7-9cbc-4c39-b9ec-f73877001cd7-kube-api-access-48bzx\") pod \"nmstate-webhook-5f6d4c5ccb-rvcwb\" (UID: \"a4a15dc7-9cbc-4c39-b9ec-f73877001cd7\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.882025 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6c8cbbf5-3146-44bc-8533-17523cd27750-ovs-socket\") pod \"nmstate-handler-jj8n7\" (UID: \"6c8cbbf5-3146-44bc-8533-17523cd27750\") " pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.882079 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6c8cbbf5-3146-44bc-8533-17523cd27750-dbus-socket\") pod \"nmstate-handler-jj8n7\" (UID: \"6c8cbbf5-3146-44bc-8533-17523cd27750\") " pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.882100 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6c8cbbf5-3146-44bc-8533-17523cd27750-nmstate-lock\") pod \"nmstate-handler-jj8n7\" (UID: \"6c8cbbf5-3146-44bc-8533-17523cd27750\") " pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.980912 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd"] Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.981892 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.982826 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48bzx\" (UniqueName: \"kubernetes.io/projected/a4a15dc7-9cbc-4c39-b9ec-f73877001cd7-kube-api-access-48bzx\") pod \"nmstate-webhook-5f6d4c5ccb-rvcwb\" (UID: \"a4a15dc7-9cbc-4c39-b9ec-f73877001cd7\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.982869 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6c8cbbf5-3146-44bc-8533-17523cd27750-ovs-socket\") pod \"nmstate-handler-jj8n7\" (UID: \"6c8cbbf5-3146-44bc-8533-17523cd27750\") " pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.982892 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6c8cbbf5-3146-44bc-8533-17523cd27750-dbus-socket\") pod \"nmstate-handler-jj8n7\" (UID: \"6c8cbbf5-3146-44bc-8533-17523cd27750\") " pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.982910 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6c8cbbf5-3146-44bc-8533-17523cd27750-nmstate-lock\") pod \"nmstate-handler-jj8n7\" (UID: \"6c8cbbf5-3146-44bc-8533-17523cd27750\") " pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.982948 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxp6f\" (UniqueName: \"kubernetes.io/projected/d03ec6db-a14f-40ee-80b7-2232ffc0a321-kube-api-access-zxp6f\") pod \"nmstate-metrics-7f946cbc9-w84ck\" (UID: \"d03ec6db-a14f-40ee-80b7-2232ffc0a321\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-w84ck" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.983005 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xvt8\" (UniqueName: \"kubernetes.io/projected/6c8cbbf5-3146-44bc-8533-17523cd27750-kube-api-access-6xvt8\") pod \"nmstate-handler-jj8n7\" (UID: \"6c8cbbf5-3146-44bc-8533-17523cd27750\") " pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.983024 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a4a15dc7-9cbc-4c39-b9ec-f73877001cd7-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-rvcwb\" (UID: \"a4a15dc7-9cbc-4c39-b9ec-f73877001cd7\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" Dec 01 14:08:47 crc kubenswrapper[4585]: E1201 14:08:47.983144 4585 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 01 14:08:47 crc kubenswrapper[4585]: E1201 14:08:47.983193 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4a15dc7-9cbc-4c39-b9ec-f73877001cd7-tls-key-pair podName:a4a15dc7-9cbc-4c39-b9ec-f73877001cd7 nodeName:}" failed. No retries permitted until 2025-12-01 14:08:48.48317478 +0000 UTC m=+642.467388635 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/a4a15dc7-9cbc-4c39-b9ec-f73877001cd7-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-rvcwb" (UID: "a4a15dc7-9cbc-4c39-b9ec-f73877001cd7") : secret "openshift-nmstate-webhook" not found Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.983372 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/6c8cbbf5-3146-44bc-8533-17523cd27750-ovs-socket\") pod \"nmstate-handler-jj8n7\" (UID: \"6c8cbbf5-3146-44bc-8533-17523cd27750\") " pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.983613 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/6c8cbbf5-3146-44bc-8533-17523cd27750-nmstate-lock\") pod \"nmstate-handler-jj8n7\" (UID: \"6c8cbbf5-3146-44bc-8533-17523cd27750\") " pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.983780 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/6c8cbbf5-3146-44bc-8533-17523cd27750-dbus-socket\") pod \"nmstate-handler-jj8n7\" (UID: \"6c8cbbf5-3146-44bc-8533-17523cd27750\") " pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.991039 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.991750 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 01 14:08:47 crc kubenswrapper[4585]: I1201 14:08:47.999492 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6lksm" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.018206 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxp6f\" (UniqueName: \"kubernetes.io/projected/d03ec6db-a14f-40ee-80b7-2232ffc0a321-kube-api-access-zxp6f\") pod \"nmstate-metrics-7f946cbc9-w84ck\" (UID: \"d03ec6db-a14f-40ee-80b7-2232ffc0a321\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-w84ck" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.022066 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd"] Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.025952 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xvt8\" (UniqueName: \"kubernetes.io/projected/6c8cbbf5-3146-44bc-8533-17523cd27750-kube-api-access-6xvt8\") pod \"nmstate-handler-jj8n7\" (UID: \"6c8cbbf5-3146-44bc-8533-17523cd27750\") " pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.033004 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48bzx\" (UniqueName: \"kubernetes.io/projected/a4a15dc7-9cbc-4c39-b9ec-f73877001cd7-kube-api-access-48bzx\") pod \"nmstate-webhook-5f6d4c5ccb-rvcwb\" (UID: \"a4a15dc7-9cbc-4c39-b9ec-f73877001cd7\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.083522 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5de4a007-93f4-45e6-a70a-5a036ff4377c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kmtzd\" (UID: \"5de4a007-93f4-45e6-a70a-5a036ff4377c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.083628 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5de4a007-93f4-45e6-a70a-5a036ff4377c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-kmtzd\" (UID: \"5de4a007-93f4-45e6-a70a-5a036ff4377c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.083673 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxwq9\" (UniqueName: \"kubernetes.io/projected/5de4a007-93f4-45e6-a70a-5a036ff4377c-kube-api-access-zxwq9\") pod \"nmstate-console-plugin-7fbb5f6569-kmtzd\" (UID: \"5de4a007-93f4-45e6-a70a-5a036ff4377c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.108409 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-w84ck" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.171488 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.184153 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5de4a007-93f4-45e6-a70a-5a036ff4377c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-kmtzd\" (UID: \"5de4a007-93f4-45e6-a70a-5a036ff4377c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.184226 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxwq9\" (UniqueName: \"kubernetes.io/projected/5de4a007-93f4-45e6-a70a-5a036ff4377c-kube-api-access-zxwq9\") pod \"nmstate-console-plugin-7fbb5f6569-kmtzd\" (UID: \"5de4a007-93f4-45e6-a70a-5a036ff4377c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.184300 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5de4a007-93f4-45e6-a70a-5a036ff4377c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kmtzd\" (UID: \"5de4a007-93f4-45e6-a70a-5a036ff4377c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" Dec 01 14:08:48 crc kubenswrapper[4585]: E1201 14:08:48.184438 4585 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 01 14:08:48 crc kubenswrapper[4585]: E1201 14:08:48.184502 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5de4a007-93f4-45e6-a70a-5a036ff4377c-plugin-serving-cert podName:5de4a007-93f4-45e6-a70a-5a036ff4377c nodeName:}" failed. No retries permitted until 2025-12-01 14:08:48.684481216 +0000 UTC m=+642.668695081 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/5de4a007-93f4-45e6-a70a-5a036ff4377c-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-kmtzd" (UID: "5de4a007-93f4-45e6-a70a-5a036ff4377c") : secret "plugin-serving-cert" not found Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.185112 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5de4a007-93f4-45e6-a70a-5a036ff4377c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-kmtzd\" (UID: \"5de4a007-93f4-45e6-a70a-5a036ff4377c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" Dec 01 14:08:48 crc kubenswrapper[4585]: W1201 14:08:48.203724 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c8cbbf5_3146_44bc_8533_17523cd27750.slice/crio-f253b476821226ef85a50a0e7a53252bd0dddaa0088cb0592a8d377e1289f12e WatchSource:0}: Error finding container f253b476821226ef85a50a0e7a53252bd0dddaa0088cb0592a8d377e1289f12e: Status 404 returned error can't find the container with id f253b476821226ef85a50a0e7a53252bd0dddaa0088cb0592a8d377e1289f12e Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.228079 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxwq9\" (UniqueName: \"kubernetes.io/projected/5de4a007-93f4-45e6-a70a-5a036ff4377c-kube-api-access-zxwq9\") pod \"nmstate-console-plugin-7fbb5f6569-kmtzd\" (UID: \"5de4a007-93f4-45e6-a70a-5a036ff4377c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.252210 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-865d78dcd8-tnnnr"] Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.253035 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.285965 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-trusted-ca-bundle\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.286039 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-console-serving-cert\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.286104 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-console-config\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.286140 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpsbd\" (UniqueName: \"kubernetes.io/projected/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-kube-api-access-wpsbd\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.286166 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-console-oauth-config\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.286192 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-service-ca\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.286284 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-oauth-serving-cert\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.291197 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-865d78dcd8-tnnnr"] Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.389823 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-console-config\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc 
kubenswrapper[4585]: I1201 14:08:48.390378 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpsbd\" (UniqueName: \"kubernetes.io/projected/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-kube-api-access-wpsbd\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.390410 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-console-oauth-config\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.390429 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-service-ca\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.390452 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-oauth-serving-cert\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.390504 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-trusted-ca-bundle\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.390542 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-console-serving-cert\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.393818 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-console-config\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.394309 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-oauth-serving-cert\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.395719 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-trusted-ca-bundle\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: 
I1201 14:08:48.396269 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-service-ca\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.397869 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-console-serving-cert\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.400535 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-console-oauth-config\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.417920 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpsbd\" (UniqueName: \"kubernetes.io/projected/374a48e7-1d43-48e5-acb9-02e38c2c9fa4-kube-api-access-wpsbd\") pod \"console-865d78dcd8-tnnnr\" (UID: \"374a48e7-1d43-48e5-acb9-02e38c2c9fa4\") " pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.484764 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-w84ck"] Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.491442 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a4a15dc7-9cbc-4c39-b9ec-f73877001cd7-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-rvcwb\" (UID: \"a4a15dc7-9cbc-4c39-b9ec-f73877001cd7\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" Dec 01 14:08:48 crc kubenswrapper[4585]: W1201 14:08:48.491929 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd03ec6db_a14f_40ee_80b7_2232ffc0a321.slice/crio-4148ff919a5f6f0b2fec98bb91854bb7e547c732d52013db17c845ee8298dab4 WatchSource:0}: Error finding container 4148ff919a5f6f0b2fec98bb91854bb7e547c732d52013db17c845ee8298dab4: Status 404 returned error can't find the container with id 4148ff919a5f6f0b2fec98bb91854bb7e547c732d52013db17c845ee8298dab4 Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.495209 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a4a15dc7-9cbc-4c39-b9ec-f73877001cd7-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-rvcwb\" (UID: \"a4a15dc7-9cbc-4c39-b9ec-f73877001cd7\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.568316 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.694604 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5de4a007-93f4-45e6-a70a-5a036ff4377c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kmtzd\" (UID: \"5de4a007-93f4-45e6-a70a-5a036ff4377c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.700245 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5de4a007-93f4-45e6-a70a-5a036ff4377c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-kmtzd\" (UID: \"5de4a007-93f4-45e6-a70a-5a036ff4377c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.733379 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.801463 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-865d78dcd8-tnnnr"] Dec 01 14:08:48 crc kubenswrapper[4585]: W1201 14:08:48.815592 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod374a48e7_1d43_48e5_acb9_02e38c2c9fa4.slice/crio-0871e6e54d200fd7f7e6263c9ada10676cce5b69b306bc735170466f901a524c WatchSource:0}: Error finding container 0871e6e54d200fd7f7e6263c9ada10676cce5b69b306bc735170466f901a524c: Status 404 returned error can't find the container with id 0871e6e54d200fd7f7e6263c9ada10676cce5b69b306bc735170466f901a524c Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.903272 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" Dec 01 14:08:48 crc kubenswrapper[4585]: I1201 14:08:48.985874 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb"] Dec 01 14:08:48 crc kubenswrapper[4585]: W1201 14:08:48.999505 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4a15dc7_9cbc_4c39_b9ec_f73877001cd7.slice/crio-4d47dd3e267ec002ac8675afa6adb7cdc2a1b923123945f744ec66148cbab41e WatchSource:0}: Error finding container 4d47dd3e267ec002ac8675afa6adb7cdc2a1b923123945f744ec66148cbab41e: Status 404 returned error can't find the container with id 4d47dd3e267ec002ac8675afa6adb7cdc2a1b923123945f744ec66148cbab41e Dec 01 14:08:49 crc kubenswrapper[4585]: I1201 14:08:49.133862 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd"] Dec 01 14:08:49 crc kubenswrapper[4585]: W1201 14:08:49.137814 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5de4a007_93f4_45e6_a70a_5a036ff4377c.slice/crio-c4ebe39efddbc1bde101d34ed0fccaf9f6f7f8c489084a96122115186526b9c6 WatchSource:0}: Error finding container c4ebe39efddbc1bde101d34ed0fccaf9f6f7f8c489084a96122115186526b9c6: Status 404 returned error can't find the container with id c4ebe39efddbc1bde101d34ed0fccaf9f6f7f8c489084a96122115186526b9c6 Dec 01 14:08:49 crc kubenswrapper[4585]: I1201 14:08:49.202119 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-w84ck" event={"ID":"d03ec6db-a14f-40ee-80b7-2232ffc0a321","Type":"ContainerStarted","Data":"4148ff919a5f6f0b2fec98bb91854bb7e547c732d52013db17c845ee8298dab4"} Dec 01 14:08:49 crc kubenswrapper[4585]: I1201 14:08:49.203380 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" event={"ID":"a4a15dc7-9cbc-4c39-b9ec-f73877001cd7","Type":"ContainerStarted","Data":"4d47dd3e267ec002ac8675afa6adb7cdc2a1b923123945f744ec66148cbab41e"} Dec 01 14:08:49 crc kubenswrapper[4585]: I1201 14:08:49.204355 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" event={"ID":"5de4a007-93f4-45e6-a70a-5a036ff4377c","Type":"ContainerStarted","Data":"c4ebe39efddbc1bde101d34ed0fccaf9f6f7f8c489084a96122115186526b9c6"} Dec 01 14:08:49 crc kubenswrapper[4585]: I1201 14:08:49.205694 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-865d78dcd8-tnnnr" event={"ID":"374a48e7-1d43-48e5-acb9-02e38c2c9fa4","Type":"ContainerStarted","Data":"736b5ef4af45397caa0c3f527951d36e51683be03f83c1d77c7060878111585c"} Dec 01 14:08:49 crc kubenswrapper[4585]: I1201 14:08:49.205718 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-865d78dcd8-tnnnr" event={"ID":"374a48e7-1d43-48e5-acb9-02e38c2c9fa4","Type":"ContainerStarted","Data":"0871e6e54d200fd7f7e6263c9ada10676cce5b69b306bc735170466f901a524c"} Dec 01 14:08:49 crc kubenswrapper[4585]: I1201 14:08:49.207433 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jj8n7" event={"ID":"6c8cbbf5-3146-44bc-8533-17523cd27750","Type":"ContainerStarted","Data":"f253b476821226ef85a50a0e7a53252bd0dddaa0088cb0592a8d377e1289f12e"} Dec 01 14:08:49 crc kubenswrapper[4585]: I1201 14:08:49.230392 4585 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-865d78dcd8-tnnnr" podStartSLOduration=1.230362384 podStartE2EDuration="1.230362384s" podCreationTimestamp="2025-12-01 14:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:08:49.224538389 +0000 UTC m=+643.208752264" watchObservedRunningTime="2025-12-01 14:08:49.230362384 +0000 UTC m=+643.214576239" Dec 01 14:08:53 crc kubenswrapper[4585]: I1201 14:08:53.241151 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jj8n7" event={"ID":"6c8cbbf5-3146-44bc-8533-17523cd27750","Type":"ContainerStarted","Data":"0546261d724b76b56b2becd1ef103b47fc4bfe9715d2adb9ff2c870bca348ffd"} Dec 01 14:08:53 crc kubenswrapper[4585]: I1201 14:08:53.242031 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:53 crc kubenswrapper[4585]: I1201 14:08:53.244606 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-w84ck" event={"ID":"d03ec6db-a14f-40ee-80b7-2232ffc0a321","Type":"ContainerStarted","Data":"8e3e5e57699f6cf7d52ac605346520a2c5ab5d71efa9436ab369418e34c7234c"} Dec 01 14:08:53 crc kubenswrapper[4585]: I1201 14:08:53.249289 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" event={"ID":"a4a15dc7-9cbc-4c39-b9ec-f73877001cd7","Type":"ContainerStarted","Data":"642ceb908e049c6eea455b6a68f8f5014c6d26c965b46948a23cab019062518b"} Dec 01 14:08:53 crc kubenswrapper[4585]: I1201 14:08:53.249444 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" Dec 01 14:08:53 crc kubenswrapper[4585]: I1201 14:08:53.251294 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" event={"ID":"5de4a007-93f4-45e6-a70a-5a036ff4377c","Type":"ContainerStarted","Data":"fdd09b205deef0b40cc8d40fac0617acfc889f42d636fa241e9275ae70142e48"} Dec 01 14:08:53 crc kubenswrapper[4585]: I1201 14:08:53.268280 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-jj8n7" podStartSLOduration=2.060508511 podStartE2EDuration="6.268244172s" podCreationTimestamp="2025-12-01 14:08:47 +0000 UTC" firstStartedPulling="2025-12-01 14:08:48.227178818 +0000 UTC m=+642.211392673" lastFinishedPulling="2025-12-01 14:08:52.434914449 +0000 UTC m=+646.419128334" observedRunningTime="2025-12-01 14:08:53.265082827 +0000 UTC m=+647.249296702" watchObservedRunningTime="2025-12-01 14:08:53.268244172 +0000 UTC m=+647.252458087" Dec 01 14:08:53 crc kubenswrapper[4585]: I1201 14:08:53.314724 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-kmtzd" podStartSLOduration=3.058290252 podStartE2EDuration="6.314690044s" podCreationTimestamp="2025-12-01 14:08:47 +0000 UTC" firstStartedPulling="2025-12-01 14:08:49.140730296 +0000 UTC m=+643.124944151" lastFinishedPulling="2025-12-01 14:08:52.397130078 +0000 UTC m=+646.381343943" observedRunningTime="2025-12-01 14:08:53.287355953 +0000 UTC m=+647.271569808" watchObservedRunningTime="2025-12-01 14:08:53.314690044 +0000 UTC m=+647.298903899" Dec 01 14:08:53 crc kubenswrapper[4585]: I1201 14:08:53.324343 4585 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" podStartSLOduration=2.9224002369999997 podStartE2EDuration="6.324307872s" podCreationTimestamp="2025-12-01 14:08:47 +0000 UTC" firstStartedPulling="2025-12-01 14:08:49.007009329 +0000 UTC m=+642.991223184" lastFinishedPulling="2025-12-01 14:08:52.408916964 +0000 UTC m=+646.393130819" observedRunningTime="2025-12-01 14:08:53.322539554 +0000 UTC m=+647.306753399" watchObservedRunningTime="2025-12-01 14:08:53.324307872 +0000 UTC m=+647.308521737" Dec 01 14:08:55 crc kubenswrapper[4585]: I1201 14:08:55.271216 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-w84ck" event={"ID":"d03ec6db-a14f-40ee-80b7-2232ffc0a321","Type":"ContainerStarted","Data":"2016e6c2a114a551ce8c167e6c2128c6d171fae68906f7b449b706605e3c2247"} Dec 01 14:08:58 crc kubenswrapper[4585]: I1201 14:08:58.211006 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jj8n7" Dec 01 14:08:58 crc kubenswrapper[4585]: I1201 14:08:58.242245 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-w84ck" podStartSLOduration=4.635793992 podStartE2EDuration="11.24220356s" podCreationTimestamp="2025-12-01 14:08:47 +0000 UTC" firstStartedPulling="2025-12-01 14:08:48.493680607 +0000 UTC m=+642.477894462" lastFinishedPulling="2025-12-01 14:08:55.100090175 +0000 UTC m=+649.084304030" observedRunningTime="2025-12-01 14:08:55.298050511 +0000 UTC m=+649.282264406" watchObservedRunningTime="2025-12-01 14:08:58.24220356 +0000 UTC m=+652.226417425" Dec 01 14:08:58 crc kubenswrapper[4585]: I1201 14:08:58.569147 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:58 crc kubenswrapper[4585]: I1201 14:08:58.569208 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:58 crc kubenswrapper[4585]: I1201 14:08:58.574259 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:59 crc kubenswrapper[4585]: I1201 14:08:59.309097 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-865d78dcd8-tnnnr" Dec 01 14:08:59 crc kubenswrapper[4585]: I1201 14:08:59.374354 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-f9k95"] Dec 01 14:09:08 crc kubenswrapper[4585]: I1201 14:09:08.741094 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-rvcwb" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.161486 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8"] Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.163290 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.166107 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.184501 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8"] Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.348470 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz8nd\" (UniqueName: \"kubernetes.io/projected/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-kube-api-access-lz8nd\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8\" (UID: \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.348582 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8\" (UID: \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.348617 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8\" (UID: \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.449647 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz8nd\" (UniqueName: \"kubernetes.io/projected/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-kube-api-access-lz8nd\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8\" (UID: \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.450030 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8\" (UID: \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.450062 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8\" (UID: \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.450580 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8\" (UID: \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.450723 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8\" (UID: \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.474860 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz8nd\" (UniqueName: \"kubernetes.io/projected/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-kube-api-access-lz8nd\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8\" (UID: \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.488626 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:22 crc kubenswrapper[4585]: I1201 14:09:22.900808 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8"] Dec 01 14:09:23 crc kubenswrapper[4585]: I1201 14:09:23.485409 4585 generic.go:334] "Generic (PLEG): container finished" podID="2fb40ba9-c26f-4fe8-900d-c5bd775febf6" containerID="c973a95a12e572cba9b59937a2b7b61ec9a9c9e3a1b6e36fdb103b173e41db55" exitCode=0 Dec 01 14:09:23 crc kubenswrapper[4585]: I1201 14:09:23.485508 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" event={"ID":"2fb40ba9-c26f-4fe8-900d-c5bd775febf6","Type":"ContainerDied","Data":"c973a95a12e572cba9b59937a2b7b61ec9a9c9e3a1b6e36fdb103b173e41db55"} Dec 01 14:09:23 crc kubenswrapper[4585]: I1201 14:09:23.487734 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" event={"ID":"2fb40ba9-c26f-4fe8-900d-c5bd775febf6","Type":"ContainerStarted","Data":"0bc7a489ae767912b829f4955f0b81e0832dfa78f49f36dbd63499925a5ae560"} Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.425165 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-f9k95" podUID="bb6e47d0-5966-48d3-be81-97265e7e7a4f" containerName="console" containerID="cri-o://44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a" gracePeriod=15 Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.752257 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-f9k95_bb6e47d0-5966-48d3-be81-97265e7e7a4f/console/0.log" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.752331 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.787227 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-serving-cert\") pod \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.787296 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-config\") pod \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.787357 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-oauth-config\") pod \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.787400 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-service-ca\") pod \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.787435 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm7wr\" (UniqueName: \"kubernetes.io/projected/bb6e47d0-5966-48d3-be81-97265e7e7a4f-kube-api-access-lm7wr\") pod \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.787453 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-trusted-ca-bundle\") pod \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.787489 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-oauth-serving-cert\") pod \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\" (UID: \"bb6e47d0-5966-48d3-be81-97265e7e7a4f\") " Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.788153 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-config" (OuterVolumeSpecName: "console-config") pod "bb6e47d0-5966-48d3-be81-97265e7e7a4f" (UID: "bb6e47d0-5966-48d3-be81-97265e7e7a4f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.788165 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bb6e47d0-5966-48d3-be81-97265e7e7a4f" (UID: "bb6e47d0-5966-48d3-be81-97265e7e7a4f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.788178 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bb6e47d0-5966-48d3-be81-97265e7e7a4f" (UID: "bb6e47d0-5966-48d3-be81-97265e7e7a4f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.788520 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-service-ca" (OuterVolumeSpecName: "service-ca") pod "bb6e47d0-5966-48d3-be81-97265e7e7a4f" (UID: "bb6e47d0-5966-48d3-be81-97265e7e7a4f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.799328 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bb6e47d0-5966-48d3-be81-97265e7e7a4f" (UID: "bb6e47d0-5966-48d3-be81-97265e7e7a4f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.799453 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bb6e47d0-5966-48d3-be81-97265e7e7a4f" (UID: "bb6e47d0-5966-48d3-be81-97265e7e7a4f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.809390 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6e47d0-5966-48d3-be81-97265e7e7a4f-kube-api-access-lm7wr" (OuterVolumeSpecName: "kube-api-access-lm7wr") pod "bb6e47d0-5966-48d3-be81-97265e7e7a4f" (UID: "bb6e47d0-5966-48d3-be81-97265e7e7a4f"). InnerVolumeSpecName "kube-api-access-lm7wr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.889066 4585 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.889139 4585 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.889509 4585 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.889531 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm7wr\" (UniqueName: \"kubernetes.io/projected/bb6e47d0-5966-48d3-be81-97265e7e7a4f-kube-api-access-lm7wr\") on node \"crc\" DevicePath \"\"" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.889541 4585 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.889549 4585 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb6e47d0-5966-48d3-be81-97265e7e7a4f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 14:09:24 crc kubenswrapper[4585]: I1201 14:09:24.889557 4585 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb6e47d0-5966-48d3-be81-97265e7e7a4f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 14:09:25 crc kubenswrapper[4585]: I1201 14:09:25.507103 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-f9k95_bb6e47d0-5966-48d3-be81-97265e7e7a4f/console/0.log" Dec 01 14:09:25 crc kubenswrapper[4585]: I1201 14:09:25.508071 4585 generic.go:334] "Generic (PLEG): container finished" podID="bb6e47d0-5966-48d3-be81-97265e7e7a4f" containerID="44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a" exitCode=2 Dec 01 14:09:25 crc kubenswrapper[4585]: I1201 14:09:25.508147 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-f9k95" Dec 01 14:09:25 crc kubenswrapper[4585]: I1201 14:09:25.508158 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-f9k95" event={"ID":"bb6e47d0-5966-48d3-be81-97265e7e7a4f","Type":"ContainerDied","Data":"44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a"} Dec 01 14:09:25 crc kubenswrapper[4585]: I1201 14:09:25.508204 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-f9k95" event={"ID":"bb6e47d0-5966-48d3-be81-97265e7e7a4f","Type":"ContainerDied","Data":"3d988e6436ba0817468a47dcfc9bd8caf37924a1c3905dad8be2ca04e6e53c99"} Dec 01 14:09:25 crc kubenswrapper[4585]: I1201 14:09:25.508226 4585 scope.go:117] "RemoveContainer" containerID="44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a" Dec 01 14:09:25 crc kubenswrapper[4585]: I1201 14:09:25.510223 4585 generic.go:334] "Generic (PLEG): container finished" podID="2fb40ba9-c26f-4fe8-900d-c5bd775febf6" containerID="ef8d1df93a652e218a77523f3cc06542715e58cec872ae199504a96245fb4a74" exitCode=0 Dec 01 14:09:25 crc kubenswrapper[4585]: I1201 14:09:25.510310 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" event={"ID":"2fb40ba9-c26f-4fe8-900d-c5bd775febf6","Type":"ContainerDied","Data":"ef8d1df93a652e218a77523f3cc06542715e58cec872ae199504a96245fb4a74"} Dec 01 14:09:25 crc kubenswrapper[4585]: I1201 14:09:25.549095 4585 scope.go:117] "RemoveContainer" containerID="44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a" Dec 01 14:09:25 crc kubenswrapper[4585]: E1201 14:09:25.550358 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a\": container with ID starting with 44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a not found: ID does not exist" containerID="44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a" Dec 01 14:09:25 crc kubenswrapper[4585]: I1201 14:09:25.550414 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a"} err="failed to get container status \"44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a\": rpc error: code = NotFound desc = could not find container \"44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a\": container with ID starting with 44719bd98646ac79145902ccb432144f564a1b267ced682f893392bd75c0241a not found: ID does not exist" Dec 01 14:09:25 crc kubenswrapper[4585]: I1201 14:09:25.622618 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-f9k95"] Dec 01 14:09:25 crc kubenswrapper[4585]: I1201 14:09:25.627235 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-f9k95"] Dec 01 14:09:26 crc kubenswrapper[4585]: I1201 14:09:26.419618 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6e47d0-5966-48d3-be81-97265e7e7a4f" path="/var/lib/kubelet/pods/bb6e47d0-5966-48d3-be81-97265e7e7a4f/volumes" Dec 01 14:09:26 crc kubenswrapper[4585]: I1201 14:09:26.520752 4585 generic.go:334] "Generic (PLEG): container finished" podID="2fb40ba9-c26f-4fe8-900d-c5bd775febf6" 
containerID="f5c7cae8a5365c89d8142d0aff38d5be7f4db602349b11926c7fed9212ef7a17" exitCode=0 Dec 01 14:09:26 crc kubenswrapper[4585]: I1201 14:09:26.520791 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" event={"ID":"2fb40ba9-c26f-4fe8-900d-c5bd775febf6","Type":"ContainerDied","Data":"f5c7cae8a5365c89d8142d0aff38d5be7f4db602349b11926c7fed9212ef7a17"} Dec 01 14:09:27 crc kubenswrapper[4585]: I1201 14:09:27.749420 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:27 crc kubenswrapper[4585]: I1201 14:09:27.871320 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz8nd\" (UniqueName: \"kubernetes.io/projected/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-kube-api-access-lz8nd\") pod \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\" (UID: \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\") " Dec 01 14:09:27 crc kubenswrapper[4585]: I1201 14:09:27.871394 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-util\") pod \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\" (UID: \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\") " Dec 01 14:09:27 crc kubenswrapper[4585]: I1201 14:09:27.871417 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-bundle\") pod \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\" (UID: \"2fb40ba9-c26f-4fe8-900d-c5bd775febf6\") " Dec 01 14:09:27 crc kubenswrapper[4585]: I1201 14:09:27.872596 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-bundle" (OuterVolumeSpecName: "bundle") pod "2fb40ba9-c26f-4fe8-900d-c5bd775febf6" (UID: "2fb40ba9-c26f-4fe8-900d-c5bd775febf6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:09:27 crc kubenswrapper[4585]: I1201 14:09:27.877582 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-kube-api-access-lz8nd" (OuterVolumeSpecName: "kube-api-access-lz8nd") pod "2fb40ba9-c26f-4fe8-900d-c5bd775febf6" (UID: "2fb40ba9-c26f-4fe8-900d-c5bd775febf6"). InnerVolumeSpecName "kube-api-access-lz8nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:09:27 crc kubenswrapper[4585]: I1201 14:09:27.886526 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-util" (OuterVolumeSpecName: "util") pod "2fb40ba9-c26f-4fe8-900d-c5bd775febf6" (UID: "2fb40ba9-c26f-4fe8-900d-c5bd775febf6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:09:27 crc kubenswrapper[4585]: I1201 14:09:27.974011 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz8nd\" (UniqueName: \"kubernetes.io/projected/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-kube-api-access-lz8nd\") on node \"crc\" DevicePath \"\"" Dec 01 14:09:27 crc kubenswrapper[4585]: I1201 14:09:27.974040 4585 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-util\") on node \"crc\" DevicePath \"\"" Dec 01 14:09:27 crc kubenswrapper[4585]: I1201 14:09:27.974085 4585 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2fb40ba9-c26f-4fe8-900d-c5bd775febf6-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:09:28 crc kubenswrapper[4585]: I1201 14:09:28.538949 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" event={"ID":"2fb40ba9-c26f-4fe8-900d-c5bd775febf6","Type":"ContainerDied","Data":"0bc7a489ae767912b829f4955f0b81e0832dfa78f49f36dbd63499925a5ae560"} Dec 01 14:09:28 crc kubenswrapper[4585]: I1201 14:09:28.539018 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc7a489ae767912b829f4955f0b81e0832dfa78f49f36dbd63499925a5ae560" Dec 01 14:09:28 crc kubenswrapper[4585]: I1201 14:09:28.539104 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.125504 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-65958ffb48-2555t"] Dec 01 14:09:41 crc kubenswrapper[4585]: E1201 14:09:41.127826 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb40ba9-c26f-4fe8-900d-c5bd775febf6" containerName="pull" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.127922 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb40ba9-c26f-4fe8-900d-c5bd775febf6" containerName="pull" Dec 01 14:09:41 crc kubenswrapper[4585]: E1201 14:09:41.128033 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb40ba9-c26f-4fe8-900d-c5bd775febf6" containerName="extract" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.128116 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb40ba9-c26f-4fe8-900d-c5bd775febf6" containerName="extract" Dec 01 14:09:41 crc kubenswrapper[4585]: E1201 14:09:41.128208 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb40ba9-c26f-4fe8-900d-c5bd775febf6" containerName="util" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.128313 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb40ba9-c26f-4fe8-900d-c5bd775febf6" containerName="util" Dec 01 14:09:41 crc kubenswrapper[4585]: E1201 14:09:41.128404 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6e47d0-5966-48d3-be81-97265e7e7a4f" containerName="console" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.128477 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6e47d0-5966-48d3-be81-97265e7e7a4f" containerName="console" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.128711 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fb40ba9-c26f-4fe8-900d-c5bd775febf6" containerName="extract" Dec 
01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.128819 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6e47d0-5966-48d3-be81-97265e7e7a4f" containerName="console" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.129460 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.134774 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.135076 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.135397 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.135464 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-k6z24" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.135874 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.151471 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-65958ffb48-2555t"] Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.180594 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/661d1aa1-ad66-45b2-8562-69776e5fb5af-webhook-cert\") pod \"metallb-operator-controller-manager-65958ffb48-2555t\" (UID: \"661d1aa1-ad66-45b2-8562-69776e5fb5af\") " pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.180961 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chnzt\" (UniqueName: \"kubernetes.io/projected/661d1aa1-ad66-45b2-8562-69776e5fb5af-kube-api-access-chnzt\") pod \"metallb-operator-controller-manager-65958ffb48-2555t\" (UID: \"661d1aa1-ad66-45b2-8562-69776e5fb5af\") " pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.181170 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/661d1aa1-ad66-45b2-8562-69776e5fb5af-apiservice-cert\") pod \"metallb-operator-controller-manager-65958ffb48-2555t\" (UID: \"661d1aa1-ad66-45b2-8562-69776e5fb5af\") " pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.282635 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chnzt\" (UniqueName: \"kubernetes.io/projected/661d1aa1-ad66-45b2-8562-69776e5fb5af-kube-api-access-chnzt\") pod \"metallb-operator-controller-manager-65958ffb48-2555t\" (UID: \"661d1aa1-ad66-45b2-8562-69776e5fb5af\") " pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.282728 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/661d1aa1-ad66-45b2-8562-69776e5fb5af-apiservice-cert\") pod \"metallb-operator-controller-manager-65958ffb48-2555t\" (UID: \"661d1aa1-ad66-45b2-8562-69776e5fb5af\") " pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.282802 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/661d1aa1-ad66-45b2-8562-69776e5fb5af-webhook-cert\") pod \"metallb-operator-controller-manager-65958ffb48-2555t\" (UID: \"661d1aa1-ad66-45b2-8562-69776e5fb5af\") " pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.303590 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/661d1aa1-ad66-45b2-8562-69776e5fb5af-webhook-cert\") pod \"metallb-operator-controller-manager-65958ffb48-2555t\" (UID: \"661d1aa1-ad66-45b2-8562-69776e5fb5af\") " pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.306656 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/661d1aa1-ad66-45b2-8562-69776e5fb5af-apiservice-cert\") pod \"metallb-operator-controller-manager-65958ffb48-2555t\" (UID: \"661d1aa1-ad66-45b2-8562-69776e5fb5af\") " pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.317799 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chnzt\" (UniqueName: \"kubernetes.io/projected/661d1aa1-ad66-45b2-8562-69776e5fb5af-kube-api-access-chnzt\") pod \"metallb-operator-controller-manager-65958ffb48-2555t\" (UID: \"661d1aa1-ad66-45b2-8562-69776e5fb5af\") " pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.449031 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.745984 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q"] Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.747480 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.761666 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.761845 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.774652 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-zb7b7" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.783563 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q"] Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.894503 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-65958ffb48-2555t"] Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.898177 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cpdk\" (UniqueName: \"kubernetes.io/projected/99217243-f8e1-4533-925c-a3fac9b81346-kube-api-access-5cpdk\") pod \"metallb-operator-webhook-server-64878d448f-cvc5q\" (UID: \"99217243-f8e1-4533-925c-a3fac9b81346\") " pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.898215 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/99217243-f8e1-4533-925c-a3fac9b81346-webhook-cert\") pod \"metallb-operator-webhook-server-64878d448f-cvc5q\" (UID: \"99217243-f8e1-4533-925c-a3fac9b81346\") " pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.898237 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/99217243-f8e1-4533-925c-a3fac9b81346-apiservice-cert\") pod \"metallb-operator-webhook-server-64878d448f-cvc5q\" (UID: \"99217243-f8e1-4533-925c-a3fac9b81346\") " pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.999357 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cpdk\" (UniqueName: \"kubernetes.io/projected/99217243-f8e1-4533-925c-a3fac9b81346-kube-api-access-5cpdk\") pod \"metallb-operator-webhook-server-64878d448f-cvc5q\" (UID: \"99217243-f8e1-4533-925c-a3fac9b81346\") " pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.999414 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/99217243-f8e1-4533-925c-a3fac9b81346-webhook-cert\") pod \"metallb-operator-webhook-server-64878d448f-cvc5q\" (UID: \"99217243-f8e1-4533-925c-a3fac9b81346\") " pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:09:41 crc kubenswrapper[4585]: I1201 14:09:41.999443 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/99217243-f8e1-4533-925c-a3fac9b81346-apiservice-cert\") pod 
\"metallb-operator-webhook-server-64878d448f-cvc5q\" (UID: \"99217243-f8e1-4533-925c-a3fac9b81346\") " pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:09:42 crc kubenswrapper[4585]: I1201 14:09:42.010242 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/99217243-f8e1-4533-925c-a3fac9b81346-apiservice-cert\") pod \"metallb-operator-webhook-server-64878d448f-cvc5q\" (UID: \"99217243-f8e1-4533-925c-a3fac9b81346\") " pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:09:42 crc kubenswrapper[4585]: I1201 14:09:42.010729 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/99217243-f8e1-4533-925c-a3fac9b81346-webhook-cert\") pod \"metallb-operator-webhook-server-64878d448f-cvc5q\" (UID: \"99217243-f8e1-4533-925c-a3fac9b81346\") " pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:09:42 crc kubenswrapper[4585]: I1201 14:09:42.024052 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cpdk\" (UniqueName: \"kubernetes.io/projected/99217243-f8e1-4533-925c-a3fac9b81346-kube-api-access-5cpdk\") pod \"metallb-operator-webhook-server-64878d448f-cvc5q\" (UID: \"99217243-f8e1-4533-925c-a3fac9b81346\") " pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:09:42 crc kubenswrapper[4585]: I1201 14:09:42.075944 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:09:42 crc kubenswrapper[4585]: I1201 14:09:42.345827 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q"] Dec 01 14:09:42 crc kubenswrapper[4585]: W1201 14:09:42.354773 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99217243_f8e1_4533_925c_a3fac9b81346.slice/crio-9316f9d063e80a89dbdc2b4fd45353c6b3e9876615314bfeb9f7173cf94c4d0a WatchSource:0}: Error finding container 9316f9d063e80a89dbdc2b4fd45353c6b3e9876615314bfeb9f7173cf94c4d0a: Status 404 returned error can't find the container with id 9316f9d063e80a89dbdc2b4fd45353c6b3e9876615314bfeb9f7173cf94c4d0a Dec 01 14:09:42 crc kubenswrapper[4585]: I1201 14:09:42.660927 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" event={"ID":"661d1aa1-ad66-45b2-8562-69776e5fb5af","Type":"ContainerStarted","Data":"1799d0b48876cd0a7c80e91ed9b5c7c2772a0767c366502552050132da4e9abf"} Dec 01 14:09:42 crc kubenswrapper[4585]: I1201 14:09:42.662282 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" event={"ID":"99217243-f8e1-4533-925c-a3fac9b81346","Type":"ContainerStarted","Data":"9316f9d063e80a89dbdc2b4fd45353c6b3e9876615314bfeb9f7173cf94c4d0a"} Dec 01 14:09:48 crc kubenswrapper[4585]: I1201 14:09:48.726875 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" event={"ID":"661d1aa1-ad66-45b2-8562-69776e5fb5af","Type":"ContainerStarted","Data":"1295e2878c8164452cbab2633ee3397004556781e81b6ff167d82a4a17462d91"} Dec 01 14:09:48 crc kubenswrapper[4585]: I1201 14:09:48.727894 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:09:48 crc kubenswrapper[4585]: I1201 14:09:48.735326 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" event={"ID":"99217243-f8e1-4533-925c-a3fac9b81346","Type":"ContainerStarted","Data":"37a071dc4036ba67696511c3c2c171b0cbaf21c239bf70ff8087e704ab25b7f2"} Dec 01 14:09:48 crc kubenswrapper[4585]: I1201 14:09:48.735517 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:09:48 crc kubenswrapper[4585]: I1201 14:09:48.760956 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" podStartSLOduration=1.23892961 podStartE2EDuration="7.76093224s" podCreationTimestamp="2025-12-01 14:09:41 +0000 UTC" firstStartedPulling="2025-12-01 14:09:41.925532866 +0000 UTC m=+695.909746711" lastFinishedPulling="2025-12-01 14:09:48.447535486 +0000 UTC m=+702.431749341" observedRunningTime="2025-12-01 14:09:48.755740911 +0000 UTC m=+702.739954766" watchObservedRunningTime="2025-12-01 14:09:48.76093224 +0000 UTC m=+702.745146105" Dec 01 14:09:48 crc kubenswrapper[4585]: I1201 14:09:48.786302 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" podStartSLOduration=1.676427684 podStartE2EDuration="7.786277018s" podCreationTimestamp="2025-12-01 14:09:41 +0000 UTC" firstStartedPulling="2025-12-01 14:09:42.357901203 +0000 UTC m=+696.342115058" lastFinishedPulling="2025-12-01 14:09:48.467750537 +0000 UTC m=+702.451964392" observedRunningTime="2025-12-01 14:09:48.783462993 +0000 UTC m=+702.767676848" watchObservedRunningTime="2025-12-01 14:09:48.786277018 +0000 UTC m=+702.770490873" Dec 01 14:10:02 crc kubenswrapper[4585]: I1201 14:10:02.086781 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-64878d448f-cvc5q" Dec 01 14:10:21 crc kubenswrapper[4585]: I1201 14:10:21.451610 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-65958ffb48-2555t" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.192370 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-nklgw"] Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.195926 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.198399 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.199993 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.200463 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-wrvk9" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.201725 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw"] Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.202574 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.205991 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.230152 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw"] Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.304682 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd2a938-907c-443a-bd52-c0dfd4fbd455-metrics-certs\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.304742 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/acd2a938-907c-443a-bd52-c0dfd4fbd455-metrics\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.304772 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/acd2a938-907c-443a-bd52-c0dfd4fbd455-frr-sockets\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.304787 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/acd2a938-907c-443a-bd52-c0dfd4fbd455-reloader\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.304851 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/acd2a938-907c-443a-bd52-c0dfd4fbd455-frr-startup\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.304875 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/acd2a938-907c-443a-bd52-c0dfd4fbd455-frr-conf\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.304895 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f62hk\" (UniqueName: \"kubernetes.io/projected/acd2a938-907c-443a-bd52-c0dfd4fbd455-kube-api-access-f62hk\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.304914 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ee13572-ff22-43ea-8570-cc0f3a64d44e-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-lvvzw\" (UID: \"7ee13572-ff22-43ea-8570-cc0f3a64d44e\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.304944 
4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzc6f\" (UniqueName: \"kubernetes.io/projected/7ee13572-ff22-43ea-8570-cc0f3a64d44e-kube-api-access-qzc6f\") pod \"frr-k8s-webhook-server-7fcb986d4-lvvzw\" (UID: \"7ee13572-ff22-43ea-8570-cc0f3a64d44e\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.319399 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tnnzj"] Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.320468 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tnnzj" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.324139 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.328283 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.328303 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-fzb78" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.331776 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.332776 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-hx8mm"] Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.333600 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.336754 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.380782 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-hx8mm"] Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406231 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/acd2a938-907c-443a-bd52-c0dfd4fbd455-frr-sockets\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406269 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/acd2a938-907c-443a-bd52-c0dfd4fbd455-reloader\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406295 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-memberlist\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406321 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/acd2a938-907c-443a-bd52-c0dfd4fbd455-frr-startup\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc 
kubenswrapper[4585]: I1201 14:10:22.406337 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-metrics-certs\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406352 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqwqk\" (UniqueName: \"kubernetes.io/projected/24c44b85-a153-4622-864f-a0f690044361-kube-api-access-wqwqk\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406380 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3028ccae-b87c-4752-9558-1399dc8fa279-metrics-certs\") pod \"controller-f8648f98b-hx8mm\" (UID: \"3028ccae-b87c-4752-9558-1399dc8fa279\") " pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406397 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/acd2a938-907c-443a-bd52-c0dfd4fbd455-frr-conf\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406421 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f62hk\" (UniqueName: \"kubernetes.io/projected/acd2a938-907c-443a-bd52-c0dfd4fbd455-kube-api-access-f62hk\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406437 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ee13572-ff22-43ea-8570-cc0f3a64d44e-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-lvvzw\" (UID: \"7ee13572-ff22-43ea-8570-cc0f3a64d44e\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406468 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzc6f\" (UniqueName: \"kubernetes.io/projected/7ee13572-ff22-43ea-8570-cc0f3a64d44e-kube-api-access-qzc6f\") pod \"frr-k8s-webhook-server-7fcb986d4-lvvzw\" (UID: \"7ee13572-ff22-43ea-8570-cc0f3a64d44e\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406490 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/24c44b85-a153-4622-864f-a0f690044361-metallb-excludel2\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406514 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd2a938-907c-443a-bd52-c0dfd4fbd455-metrics-certs\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406535 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25hrh\" (UniqueName: \"kubernetes.io/projected/3028ccae-b87c-4752-9558-1399dc8fa279-kube-api-access-25hrh\") pod \"controller-f8648f98b-hx8mm\" (UID: \"3028ccae-b87c-4752-9558-1399dc8fa279\") " pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406551 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3028ccae-b87c-4752-9558-1399dc8fa279-cert\") pod \"controller-f8648f98b-hx8mm\" (UID: \"3028ccae-b87c-4752-9558-1399dc8fa279\") " pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.406566 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/acd2a938-907c-443a-bd52-c0dfd4fbd455-metrics\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.407114 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/acd2a938-907c-443a-bd52-c0dfd4fbd455-metrics\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.407314 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/acd2a938-907c-443a-bd52-c0dfd4fbd455-frr-sockets\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.407513 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/acd2a938-907c-443a-bd52-c0dfd4fbd455-reloader\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: E1201 14:10:22.408204 4585 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 01 14:10:22 crc kubenswrapper[4585]: E1201 14:10:22.408245 4585 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 01 14:10:22 crc kubenswrapper[4585]: E1201 14:10:22.408297 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd2a938-907c-443a-bd52-c0dfd4fbd455-metrics-certs podName:acd2a938-907c-443a-bd52-c0dfd4fbd455 nodeName:}" failed. No retries permitted until 2025-12-01 14:10:22.908277923 +0000 UTC m=+736.892491778 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acd2a938-907c-443a-bd52-c0dfd4fbd455-metrics-certs") pod "frr-k8s-nklgw" (UID: "acd2a938-907c-443a-bd52-c0dfd4fbd455") : secret "frr-k8s-certs-secret" not found Dec 01 14:10:22 crc kubenswrapper[4585]: E1201 14:10:22.408324 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ee13572-ff22-43ea-8570-cc0f3a64d44e-cert podName:7ee13572-ff22-43ea-8570-cc0f3a64d44e nodeName:}" failed. No retries permitted until 2025-12-01 14:10:22.908314404 +0000 UTC m=+736.892528399 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ee13572-ff22-43ea-8570-cc0f3a64d44e-cert") pod "frr-k8s-webhook-server-7fcb986d4-lvvzw" (UID: "7ee13572-ff22-43ea-8570-cc0f3a64d44e") : secret "frr-k8s-webhook-server-cert" not found Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.408329 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/acd2a938-907c-443a-bd52-c0dfd4fbd455-frr-startup\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.408528 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/acd2a938-907c-443a-bd52-c0dfd4fbd455-frr-conf\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.432416 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzc6f\" (UniqueName: \"kubernetes.io/projected/7ee13572-ff22-43ea-8570-cc0f3a64d44e-kube-api-access-qzc6f\") pod \"frr-k8s-webhook-server-7fcb986d4-lvvzw\" (UID: \"7ee13572-ff22-43ea-8570-cc0f3a64d44e\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.443519 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f62hk\" (UniqueName: \"kubernetes.io/projected/acd2a938-907c-443a-bd52-c0dfd4fbd455-kube-api-access-f62hk\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.507269 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-metrics-certs\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.507919 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqwqk\" (UniqueName: \"kubernetes.io/projected/24c44b85-a153-4622-864f-a0f690044361-kube-api-access-wqwqk\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.508538 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3028ccae-b87c-4752-9558-1399dc8fa279-metrics-certs\") pod \"controller-f8648f98b-hx8mm\" (UID: \"3028ccae-b87c-4752-9558-1399dc8fa279\") " pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.508718 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/24c44b85-a153-4622-864f-a0f690044361-metallb-excludel2\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.508844 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25hrh\" (UniqueName: \"kubernetes.io/projected/3028ccae-b87c-4752-9558-1399dc8fa279-kube-api-access-25hrh\") pod \"controller-f8648f98b-hx8mm\" 
(UID: \"3028ccae-b87c-4752-9558-1399dc8fa279\") " pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.508944 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3028ccae-b87c-4752-9558-1399dc8fa279-cert\") pod \"controller-f8648f98b-hx8mm\" (UID: \"3028ccae-b87c-4752-9558-1399dc8fa279\") " pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.509055 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-memberlist\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:22 crc kubenswrapper[4585]: E1201 14:10:22.508073 4585 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 01 14:10:22 crc kubenswrapper[4585]: E1201 14:10:22.509945 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-metrics-certs podName:24c44b85-a153-4622-864f-a0f690044361 nodeName:}" failed. No retries permitted until 2025-12-01 14:10:23.009931682 +0000 UTC m=+736.994145537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-metrics-certs") pod "speaker-tnnzj" (UID: "24c44b85-a153-4622-864f-a0f690044361") : secret "speaker-certs-secret" not found Dec 01 14:10:22 crc kubenswrapper[4585]: E1201 14:10:22.509288 4585 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 14:10:22 crc kubenswrapper[4585]: E1201 14:10:22.510190 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-memberlist podName:24c44b85-a153-4622-864f-a0f690044361 nodeName:}" failed. No retries permitted until 2025-12-01 14:10:23.010161149 +0000 UTC m=+736.994375004 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-memberlist") pod "speaker-tnnzj" (UID: "24c44b85-a153-4622-864f-a0f690044361") : secret "metallb-memberlist" not found Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.510997 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/24c44b85-a153-4622-864f-a0f690044361-metallb-excludel2\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.512820 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3028ccae-b87c-4752-9558-1399dc8fa279-metrics-certs\") pod \"controller-f8648f98b-hx8mm\" (UID: \"3028ccae-b87c-4752-9558-1399dc8fa279\") " pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.514522 4585 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.525322 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3028ccae-b87c-4752-9558-1399dc8fa279-cert\") pod \"controller-f8648f98b-hx8mm\" (UID: \"3028ccae-b87c-4752-9558-1399dc8fa279\") " pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.534522 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqwqk\" (UniqueName: \"kubernetes.io/projected/24c44b85-a153-4622-864f-a0f690044361-kube-api-access-wqwqk\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.539347 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25hrh\" (UniqueName: \"kubernetes.io/projected/3028ccae-b87c-4752-9558-1399dc8fa279-kube-api-access-25hrh\") pod \"controller-f8648f98b-hx8mm\" (UID: \"3028ccae-b87c-4752-9558-1399dc8fa279\") " pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.644919 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.852231 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-hx8mm"] Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.914041 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ee13572-ff22-43ea-8570-cc0f3a64d44e-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-lvvzw\" (UID: \"7ee13572-ff22-43ea-8570-cc0f3a64d44e\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.914121 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd2a938-907c-443a-bd52-c0dfd4fbd455-metrics-certs\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.919283 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd2a938-907c-443a-bd52-c0dfd4fbd455-metrics-certs\") pod \"frr-k8s-nklgw\" (UID: \"acd2a938-907c-443a-bd52-c0dfd4fbd455\") " pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.919883 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ee13572-ff22-43ea-8570-cc0f3a64d44e-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-lvvzw\" (UID: \"7ee13572-ff22-43ea-8570-cc0f3a64d44e\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" Dec 01 14:10:22 crc kubenswrapper[4585]: I1201 14:10:22.945435 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-hx8mm" event={"ID":"3028ccae-b87c-4752-9558-1399dc8fa279","Type":"ContainerStarted","Data":"7933e851f460f6433a1906af23f5db348f4262c38c35bb435a134da912ce9ee9"} Dec 01 14:10:23 crc kubenswrapper[4585]: I1201 14:10:23.015099 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-memberlist\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:23 crc kubenswrapper[4585]: I1201 14:10:23.015189 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-metrics-certs\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:23 crc kubenswrapper[4585]: E1201 14:10:23.015358 4585 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 14:10:23 crc kubenswrapper[4585]: E1201 14:10:23.015477 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-memberlist podName:24c44b85-a153-4622-864f-a0f690044361 nodeName:}" failed. No retries permitted until 2025-12-01 14:10:24.015424655 +0000 UTC m=+737.999638510 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-memberlist") pod "speaker-tnnzj" (UID: "24c44b85-a153-4622-864f-a0f690044361") : secret "metallb-memberlist" not found Dec 01 14:10:23 crc kubenswrapper[4585]: I1201 14:10:23.024027 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-metrics-certs\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:23 crc kubenswrapper[4585]: I1201 14:10:23.120352 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:23 crc kubenswrapper[4585]: I1201 14:10:23.132547 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" Dec 01 14:10:23 crc kubenswrapper[4585]: I1201 14:10:23.359760 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw"] Dec 01 14:10:23 crc kubenswrapper[4585]: W1201 14:10:23.365259 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ee13572_ff22_43ea_8570_cc0f3a64d44e.slice/crio-27c6972cc10ad581e510bf2221b3ea16e979bed0104b6364225a3f9882a90b91 WatchSource:0}: Error finding container 27c6972cc10ad581e510bf2221b3ea16e979bed0104b6364225a3f9882a90b91: Status 404 returned error can't find the container with id 27c6972cc10ad581e510bf2221b3ea16e979bed0104b6364225a3f9882a90b91 Dec 01 14:10:23 crc kubenswrapper[4585]: I1201 14:10:23.953032 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-hx8mm" event={"ID":"3028ccae-b87c-4752-9558-1399dc8fa279","Type":"ContainerStarted","Data":"06c7d37c9b2024a0e5254a9b5081eafc0c4bd3d675443152a049e256e346b58e"} Dec 01 14:10:23 crc kubenswrapper[4585]: I1201 14:10:23.953085 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-hx8mm" event={"ID":"3028ccae-b87c-4752-9558-1399dc8fa279","Type":"ContainerStarted","Data":"6f7d87f888ed16ce988828975308e8cf8e20f79bbe49fe5574e4083d55f4d6b7"} Dec 01 14:10:23 crc kubenswrapper[4585]: I1201 14:10:23.953358 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:23 crc kubenswrapper[4585]: I1201 14:10:23.954023 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" event={"ID":"7ee13572-ff22-43ea-8570-cc0f3a64d44e","Type":"ContainerStarted","Data":"27c6972cc10ad581e510bf2221b3ea16e979bed0104b6364225a3f9882a90b91"} Dec 01 14:10:23 crc kubenswrapper[4585]: I1201 14:10:23.954962 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nklgw" event={"ID":"acd2a938-907c-443a-bd52-c0dfd4fbd455","Type":"ContainerStarted","Data":"16c5227b12f56caa165fdf5107b4dcffce714837eb9259dd4f767fea734f6f2e"} Dec 01 14:10:23 crc kubenswrapper[4585]: I1201 14:10:23.977562 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-hx8mm" podStartSLOduration=1.977529281 podStartE2EDuration="1.977529281s" podCreationTimestamp="2025-12-01 14:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 14:10:23.976812552 +0000 UTC m=+737.961026437" watchObservedRunningTime="2025-12-01 14:10:23.977529281 +0000 UTC m=+737.961743136" Dec 01 14:10:24 crc kubenswrapper[4585]: I1201 14:10:24.028219 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-memberlist\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:24 crc kubenswrapper[4585]: I1201 14:10:24.036518 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24c44b85-a153-4622-864f-a0f690044361-memberlist\") pod \"speaker-tnnzj\" (UID: \"24c44b85-a153-4622-864f-a0f690044361\") " pod="metallb-system/speaker-tnnzj" Dec 01 14:10:24 crc kubenswrapper[4585]: I1201 14:10:24.132114 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tnnzj" Dec 01 14:10:24 crc kubenswrapper[4585]: W1201 14:10:24.156203 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24c44b85_a153_4622_864f_a0f690044361.slice/crio-119c5e4c4e5364c8803ee21f315190dbb4dd913ec83e973bdc15060ff60501bc WatchSource:0}: Error finding container 119c5e4c4e5364c8803ee21f315190dbb4dd913ec83e973bdc15060ff60501bc: Status 404 returned error can't find the container with id 119c5e4c4e5364c8803ee21f315190dbb4dd913ec83e973bdc15060ff60501bc Dec 01 14:10:24 crc kubenswrapper[4585]: I1201 14:10:24.968147 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tnnzj" event={"ID":"24c44b85-a153-4622-864f-a0f690044361","Type":"ContainerStarted","Data":"a2589655626863e3ae7d0dbd4ad2baa41795176dcbd2c963efac38eb8f0bd60b"} Dec 01 14:10:24 crc kubenswrapper[4585]: I1201 14:10:24.968571 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tnnzj" event={"ID":"24c44b85-a153-4622-864f-a0f690044361","Type":"ContainerStarted","Data":"34f7dc2dab0dde52a167ef1431546ac45fbc4d91fd5ddd8fac30c6b745495aaf"} Dec 01 14:10:24 crc kubenswrapper[4585]: I1201 14:10:24.968582 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tnnzj" event={"ID":"24c44b85-a153-4622-864f-a0f690044361","Type":"ContainerStarted","Data":"119c5e4c4e5364c8803ee21f315190dbb4dd913ec83e973bdc15060ff60501bc"} Dec 01 14:10:24 crc kubenswrapper[4585]: I1201 14:10:24.968777 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tnnzj" Dec 01 14:10:26 crc kubenswrapper[4585]: I1201 14:10:26.438209 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tnnzj" podStartSLOduration=4.438192987 podStartE2EDuration="4.438192987s" podCreationTimestamp="2025-12-01 14:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:10:24.995483073 +0000 UTC m=+738.979696928" watchObservedRunningTime="2025-12-01 14:10:26.438192987 +0000 UTC m=+740.422406842" Dec 01 14:10:32 crc kubenswrapper[4585]: I1201 14:10:32.030770 4585 generic.go:334] "Generic (PLEG): container finished" podID="acd2a938-907c-443a-bd52-c0dfd4fbd455" containerID="9b2c0aad2154f0efef8835dc38460aaf8f8d1e872033054841612bd30db6e4f3" exitCode=0 Dec 01 14:10:32 crc kubenswrapper[4585]: I1201 14:10:32.030910 4585 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nklgw" event={"ID":"acd2a938-907c-443a-bd52-c0dfd4fbd455","Type":"ContainerDied","Data":"9b2c0aad2154f0efef8835dc38460aaf8f8d1e872033054841612bd30db6e4f3"} Dec 01 14:10:32 crc kubenswrapper[4585]: I1201 14:10:32.035276 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" event={"ID":"7ee13572-ff22-43ea-8570-cc0f3a64d44e","Type":"ContainerStarted","Data":"f0ba6788386308f7ab37d65074b9525d50ffb73a5da7c06905af479e25e0024e"} Dec 01 14:10:32 crc kubenswrapper[4585]: I1201 14:10:32.035389 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" Dec 01 14:10:32 crc kubenswrapper[4585]: I1201 14:10:32.072149 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" podStartSLOduration=1.7757807460000001 podStartE2EDuration="10.072126591s" podCreationTimestamp="2025-12-01 14:10:22 +0000 UTC" firstStartedPulling="2025-12-01 14:10:23.366906808 +0000 UTC m=+737.351120663" lastFinishedPulling="2025-12-01 14:10:31.663252653 +0000 UTC m=+745.647466508" observedRunningTime="2025-12-01 14:10:32.066794268 +0000 UTC m=+746.051008143" watchObservedRunningTime="2025-12-01 14:10:32.072126591 +0000 UTC m=+746.056340446" Dec 01 14:10:33 crc kubenswrapper[4585]: I1201 14:10:33.046038 4585 generic.go:334] "Generic (PLEG): container finished" podID="acd2a938-907c-443a-bd52-c0dfd4fbd455" containerID="5bb015e361dc1d2cb27f46597abe4330bb209cb8b3752c7e7a073dc95c47ae8e" exitCode=0 Dec 01 14:10:33 crc kubenswrapper[4585]: I1201 14:10:33.046156 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nklgw" event={"ID":"acd2a938-907c-443a-bd52-c0dfd4fbd455","Type":"ContainerDied","Data":"5bb015e361dc1d2cb27f46597abe4330bb209cb8b3752c7e7a073dc95c47ae8e"} Dec 01 14:10:34 crc kubenswrapper[4585]: I1201 14:10:34.057169 4585 generic.go:334] "Generic (PLEG): container finished" podID="acd2a938-907c-443a-bd52-c0dfd4fbd455" containerID="12520982a4135bbf3e077c5835534571ddcd1b08a4b73183499b07dec034fc37" exitCode=0 Dec 01 14:10:34 crc kubenswrapper[4585]: I1201 14:10:34.057200 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nklgw" event={"ID":"acd2a938-907c-443a-bd52-c0dfd4fbd455","Type":"ContainerDied","Data":"12520982a4135bbf3e077c5835534571ddcd1b08a4b73183499b07dec034fc37"} Dec 01 14:10:34 crc kubenswrapper[4585]: I1201 14:10:34.135803 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tnnzj" Dec 01 14:10:35 crc kubenswrapper[4585]: I1201 14:10:35.065210 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nklgw" event={"ID":"acd2a938-907c-443a-bd52-c0dfd4fbd455","Type":"ContainerStarted","Data":"96e12d68e8a79b3b2b416c07beb904b63da52305b0ce72418420b172bfedbacf"} Dec 01 14:10:35 crc kubenswrapper[4585]: I1201 14:10:35.065542 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nklgw" event={"ID":"acd2a938-907c-443a-bd52-c0dfd4fbd455","Type":"ContainerStarted","Data":"77e09df6d965f31a5bf6bdbcc88f00a63626033f06e43fd83ec94bb688d499e3"} Dec 01 14:10:35 crc kubenswrapper[4585]: I1201 14:10:35.065557 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nklgw" 
event={"ID":"acd2a938-907c-443a-bd52-c0dfd4fbd455","Type":"ContainerStarted","Data":"9cd57886a34f418d2b1a9845cf5e17b0e96490d8b8561811124eeda79763753b"} Dec 01 14:10:35 crc kubenswrapper[4585]: I1201 14:10:35.536108 4585 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 14:10:36 crc kubenswrapper[4585]: I1201 14:10:36.087468 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nklgw" event={"ID":"acd2a938-907c-443a-bd52-c0dfd4fbd455","Type":"ContainerStarted","Data":"3634b1e8256a34e2e48328874d7a1194293d94945215f6744eb1aeb97eb04f12"} Dec 01 14:10:36 crc kubenswrapper[4585]: I1201 14:10:36.087529 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nklgw" event={"ID":"acd2a938-907c-443a-bd52-c0dfd4fbd455","Type":"ContainerStarted","Data":"092cf9914a8cbf619b6140908efab1f35d590abbbbe1a06d844c14855b516594"} Dec 01 14:10:36 crc kubenswrapper[4585]: I1201 14:10:36.087546 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-nklgw" event={"ID":"acd2a938-907c-443a-bd52-c0dfd4fbd455","Type":"ContainerStarted","Data":"74cd16d7ed39e76f08ccc9e4014327897f7c99e01936cddff79645d6051701ec"} Dec 01 14:10:36 crc kubenswrapper[4585]: I1201 14:10:36.088785 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:36 crc kubenswrapper[4585]: I1201 14:10:36.116095 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-nklgw" podStartSLOduration=5.747802372 podStartE2EDuration="14.116077041s" podCreationTimestamp="2025-12-01 14:10:22 +0000 UTC" firstStartedPulling="2025-12-01 14:10:23.267343654 +0000 UTC m=+737.251557509" lastFinishedPulling="2025-12-01 14:10:31.635618323 +0000 UTC m=+745.619832178" observedRunningTime="2025-12-01 14:10:36.112088474 +0000 UTC m=+750.096302339" watchObservedRunningTime="2025-12-01 14:10:36.116077041 +0000 UTC m=+750.100290896" Dec 01 14:10:37 crc kubenswrapper[4585]: I1201 14:10:37.194871 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-77rsx"] Dec 01 14:10:37 crc kubenswrapper[4585]: I1201 14:10:37.196033 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-77rsx" Dec 01 14:10:37 crc kubenswrapper[4585]: I1201 14:10:37.198478 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-gz8gg" Dec 01 14:10:37 crc kubenswrapper[4585]: I1201 14:10:37.198745 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 01 14:10:37 crc kubenswrapper[4585]: I1201 14:10:37.199053 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 01 14:10:37 crc kubenswrapper[4585]: I1201 14:10:37.211835 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-77rsx"] Dec 01 14:10:37 crc kubenswrapper[4585]: I1201 14:10:37.331987 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhx8b\" (UniqueName: \"kubernetes.io/projected/fea090b7-3832-4a1b-9d73-5f5407633e9f-kube-api-access-dhx8b\") pod \"openstack-operator-index-77rsx\" (UID: \"fea090b7-3832-4a1b-9d73-5f5407633e9f\") " pod="openstack-operators/openstack-operator-index-77rsx" Dec 01 14:10:37 crc kubenswrapper[4585]: I1201 14:10:37.433613 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhx8b\" (UniqueName: \"kubernetes.io/projected/fea090b7-3832-4a1b-9d73-5f5407633e9f-kube-api-access-dhx8b\") pod \"openstack-operator-index-77rsx\" (UID: \"fea090b7-3832-4a1b-9d73-5f5407633e9f\") " pod="openstack-operators/openstack-operator-index-77rsx" Dec 01 14:10:37 crc kubenswrapper[4585]: I1201 14:10:37.465024 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhx8b\" (UniqueName: \"kubernetes.io/projected/fea090b7-3832-4a1b-9d73-5f5407633e9f-kube-api-access-dhx8b\") pod \"openstack-operator-index-77rsx\" (UID: \"fea090b7-3832-4a1b-9d73-5f5407633e9f\") " pod="openstack-operators/openstack-operator-index-77rsx" Dec 01 14:10:37 crc kubenswrapper[4585]: I1201 14:10:37.513199 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-77rsx" Dec 01 14:10:37 crc kubenswrapper[4585]: I1201 14:10:37.813592 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-77rsx"] Dec 01 14:10:37 crc kubenswrapper[4585]: W1201 14:10:37.822115 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfea090b7_3832_4a1b_9d73_5f5407633e9f.slice/crio-db7705b74a87a64752b01d6677d5b79f91bf9a7e714715f384b3ba2704c71bc3 WatchSource:0}: Error finding container db7705b74a87a64752b01d6677d5b79f91bf9a7e714715f384b3ba2704c71bc3: Status 404 returned error can't find the container with id db7705b74a87a64752b01d6677d5b79f91bf9a7e714715f384b3ba2704c71bc3 Dec 01 14:10:38 crc kubenswrapper[4585]: I1201 14:10:38.114096 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-77rsx" event={"ID":"fea090b7-3832-4a1b-9d73-5f5407633e9f","Type":"ContainerStarted","Data":"db7705b74a87a64752b01d6677d5b79f91bf9a7e714715f384b3ba2704c71bc3"} Dec 01 14:10:38 crc kubenswrapper[4585]: I1201 14:10:38.121439 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:38 crc kubenswrapper[4585]: I1201 14:10:38.168590 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:40 crc kubenswrapper[4585]: I1201 14:10:40.371321 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-77rsx"] Dec 01 14:10:40 crc kubenswrapper[4585]: I1201 14:10:40.980463 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-47c9p"] Dec 01 14:10:40 crc kubenswrapper[4585]: I1201 14:10:40.982315 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-47c9p" Dec 01 14:10:40 crc kubenswrapper[4585]: I1201 14:10:40.990262 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-47c9p"] Dec 01 14:10:41 crc kubenswrapper[4585]: I1201 14:10:41.085774 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww9ck\" (UniqueName: \"kubernetes.io/projected/12b19089-35d3-41e8-b50f-385c3d8bb27a-kube-api-access-ww9ck\") pod \"openstack-operator-index-47c9p\" (UID: \"12b19089-35d3-41e8-b50f-385c3d8bb27a\") " pod="openstack-operators/openstack-operator-index-47c9p" Dec 01 14:10:41 crc kubenswrapper[4585]: I1201 14:10:41.133092 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-77rsx" event={"ID":"fea090b7-3832-4a1b-9d73-5f5407633e9f","Type":"ContainerStarted","Data":"3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5"} Dec 01 14:10:41 crc kubenswrapper[4585]: I1201 14:10:41.148186 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-77rsx" podStartSLOduration=1.882405015 podStartE2EDuration="4.148167536s" podCreationTimestamp="2025-12-01 14:10:37 +0000 UTC" firstStartedPulling="2025-12-01 14:10:37.823411375 +0000 UTC m=+751.807625230" lastFinishedPulling="2025-12-01 14:10:40.089173896 +0000 UTC m=+754.073387751" observedRunningTime="2025-12-01 14:10:41.144997481 +0000 UTC m=+755.129211356" watchObservedRunningTime="2025-12-01 14:10:41.148167536 +0000 UTC m=+755.132381391" Dec 01 14:10:41 crc kubenswrapper[4585]: I1201 14:10:41.187810 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww9ck\" (UniqueName: \"kubernetes.io/projected/12b19089-35d3-41e8-b50f-385c3d8bb27a-kube-api-access-ww9ck\") pod \"openstack-operator-index-47c9p\" (UID: \"12b19089-35d3-41e8-b50f-385c3d8bb27a\") " pod="openstack-operators/openstack-operator-index-47c9p" Dec 01 14:10:41 crc kubenswrapper[4585]: I1201 14:10:41.204469 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww9ck\" (UniqueName: \"kubernetes.io/projected/12b19089-35d3-41e8-b50f-385c3d8bb27a-kube-api-access-ww9ck\") pod \"openstack-operator-index-47c9p\" (UID: \"12b19089-35d3-41e8-b50f-385c3d8bb27a\") " pod="openstack-operators/openstack-operator-index-47c9p" Dec 01 14:10:41 crc kubenswrapper[4585]: I1201 14:10:41.301502 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-47c9p" Dec 01 14:10:41 crc kubenswrapper[4585]: I1201 14:10:41.682183 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-47c9p"] Dec 01 14:10:42 crc kubenswrapper[4585]: I1201 14:10:42.141233 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-47c9p" event={"ID":"12b19089-35d3-41e8-b50f-385c3d8bb27a","Type":"ContainerStarted","Data":"12c0bf59fdf43b8cfb953770d9df51d71d6424d0996a33932243842d0066c197"} Dec 01 14:10:42 crc kubenswrapper[4585]: I1201 14:10:42.141509 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-47c9p" event={"ID":"12b19089-35d3-41e8-b50f-385c3d8bb27a","Type":"ContainerStarted","Data":"3f0997324d8e9c4c31462190ba198377063d47b9c4cc0f2cdff9f1c45e47364d"} Dec 01 14:10:42 crc kubenswrapper[4585]: I1201 14:10:42.141291 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-77rsx" podUID="fea090b7-3832-4a1b-9d73-5f5407633e9f" containerName="registry-server" containerID="cri-o://3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5" gracePeriod=2 Dec 01 14:10:42 crc kubenswrapper[4585]: I1201 14:10:42.157451 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-47c9p" podStartSLOduration=2.053654438 podStartE2EDuration="2.157424073s" podCreationTimestamp="2025-12-01 14:10:40 +0000 UTC" firstStartedPulling="2025-12-01 14:10:41.692525057 +0000 UTC m=+755.676738932" lastFinishedPulling="2025-12-01 14:10:41.796294712 +0000 UTC m=+755.780508567" observedRunningTime="2025-12-01 14:10:42.154743371 +0000 UTC m=+756.138957226" watchObservedRunningTime="2025-12-01 14:10:42.157424073 +0000 UTC m=+756.141637928" Dec 01 14:10:42 crc kubenswrapper[4585]: I1201 14:10:42.485411 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-77rsx" Dec 01 14:10:42 crc kubenswrapper[4585]: I1201 14:10:42.608017 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhx8b\" (UniqueName: \"kubernetes.io/projected/fea090b7-3832-4a1b-9d73-5f5407633e9f-kube-api-access-dhx8b\") pod \"fea090b7-3832-4a1b-9d73-5f5407633e9f\" (UID: \"fea090b7-3832-4a1b-9d73-5f5407633e9f\") " Dec 01 14:10:42 crc kubenswrapper[4585]: I1201 14:10:42.612611 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea090b7-3832-4a1b-9d73-5f5407633e9f-kube-api-access-dhx8b" (OuterVolumeSpecName: "kube-api-access-dhx8b") pod "fea090b7-3832-4a1b-9d73-5f5407633e9f" (UID: "fea090b7-3832-4a1b-9d73-5f5407633e9f"). InnerVolumeSpecName "kube-api-access-dhx8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:10:42 crc kubenswrapper[4585]: I1201 14:10:42.650310 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-hx8mm" Dec 01 14:10:42 crc kubenswrapper[4585]: I1201 14:10:42.709413 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhx8b\" (UniqueName: \"kubernetes.io/projected/fea090b7-3832-4a1b-9d73-5f5407633e9f-kube-api-access-dhx8b\") on node \"crc\" DevicePath \"\"" Dec 01 14:10:43 crc kubenswrapper[4585]: I1201 14:10:43.136936 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lvvzw" Dec 01 14:10:43 crc kubenswrapper[4585]: I1201 14:10:43.147505 4585 generic.go:334] "Generic (PLEG): container finished" podID="fea090b7-3832-4a1b-9d73-5f5407633e9f" containerID="3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5" exitCode=0 Dec 01 14:10:43 crc kubenswrapper[4585]: I1201 14:10:43.147947 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-77rsx" Dec 01 14:10:43 crc kubenswrapper[4585]: I1201 14:10:43.148040 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-77rsx" event={"ID":"fea090b7-3832-4a1b-9d73-5f5407633e9f","Type":"ContainerDied","Data":"3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5"} Dec 01 14:10:43 crc kubenswrapper[4585]: I1201 14:10:43.148113 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-77rsx" event={"ID":"fea090b7-3832-4a1b-9d73-5f5407633e9f","Type":"ContainerDied","Data":"db7705b74a87a64752b01d6677d5b79f91bf9a7e714715f384b3ba2704c71bc3"} Dec 01 14:10:43 crc kubenswrapper[4585]: I1201 14:10:43.148143 4585 scope.go:117] "RemoveContainer" containerID="3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5" Dec 01 14:10:43 crc kubenswrapper[4585]: I1201 14:10:43.172313 4585 scope.go:117] "RemoveContainer" containerID="3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5" Dec 01 14:10:43 crc kubenswrapper[4585]: E1201 14:10:43.173573 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5\": container with ID starting with 3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5 not found: ID does not exist" containerID="3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5" Dec 01 14:10:43 crc kubenswrapper[4585]: I1201 14:10:43.173629 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5"} err="failed to get container status \"3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5\": rpc error: code = NotFound desc = could not find container \"3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5\": container with ID starting with 3a9616f3bd43b401692b06fee41b7120a23a72ccfc0136a34584f9982cb705b5 not found: ID does not exist" Dec 01 14:10:43 crc kubenswrapper[4585]: I1201 14:10:43.192353 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-77rsx"] Dec 01 14:10:43 crc kubenswrapper[4585]: I1201 14:10:43.198088 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/openstack-operator-index-77rsx"] Dec 01 14:10:43 crc kubenswrapper[4585]: I1201 14:10:43.717040 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:10:43 crc kubenswrapper[4585]: I1201 14:10:43.717161 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:10:44 crc kubenswrapper[4585]: I1201 14:10:44.420954 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fea090b7-3832-4a1b-9d73-5f5407633e9f" path="/var/lib/kubelet/pods/fea090b7-3832-4a1b-9d73-5f5407633e9f/volumes" Dec 01 14:10:51 crc kubenswrapper[4585]: I1201 14:10:51.301729 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-47c9p" Dec 01 14:10:51 crc kubenswrapper[4585]: I1201 14:10:51.302264 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-47c9p" Dec 01 14:10:51 crc kubenswrapper[4585]: I1201 14:10:51.327107 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-47c9p" Dec 01 14:10:52 crc kubenswrapper[4585]: I1201 14:10:52.236839 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-47c9p" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.122356 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-nklgw" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.616112 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c"] Dec 01 14:10:53 crc kubenswrapper[4585]: E1201 14:10:53.616464 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea090b7-3832-4a1b-9d73-5f5407633e9f" containerName="registry-server" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.616485 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea090b7-3832-4a1b-9d73-5f5407633e9f" containerName="registry-server" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.616694 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea090b7-3832-4a1b-9d73-5f5407633e9f" containerName="registry-server" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.618242 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.620402 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9hjlq" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.626614 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c"] Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.748424 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9vbf\" (UniqueName: \"kubernetes.io/projected/d29365aa-3c8b-46c7-8b46-eb101a582cc2-kube-api-access-v9vbf\") pod \"92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c\" (UID: \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\") " pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.748510 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d29365aa-3c8b-46c7-8b46-eb101a582cc2-bundle\") pod \"92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c\" (UID: \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\") " pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.748530 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d29365aa-3c8b-46c7-8b46-eb101a582cc2-util\") pod \"92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c\" (UID: \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\") " pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.849502 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9vbf\" (UniqueName: \"kubernetes.io/projected/d29365aa-3c8b-46c7-8b46-eb101a582cc2-kube-api-access-v9vbf\") pod \"92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c\" (UID: \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\") " pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.849789 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d29365aa-3c8b-46c7-8b46-eb101a582cc2-bundle\") pod \"92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c\" (UID: \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\") " pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.849907 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d29365aa-3c8b-46c7-8b46-eb101a582cc2-util\") pod \"92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c\" (UID: \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\") " pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.850488 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d29365aa-3c8b-46c7-8b46-eb101a582cc2-util\") pod \"92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c\" (UID: \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\") " pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.850864 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d29365aa-3c8b-46c7-8b46-eb101a582cc2-bundle\") pod \"92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c\" (UID: \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\") " pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.867844 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9vbf\" (UniqueName: \"kubernetes.io/projected/d29365aa-3c8b-46c7-8b46-eb101a582cc2-kube-api-access-v9vbf\") pod \"92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c\" (UID: \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\") " pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:10:53 crc kubenswrapper[4585]: I1201 14:10:53.947488 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:10:54 crc kubenswrapper[4585]: I1201 14:10:54.393114 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c"] Dec 01 14:10:55 crc kubenswrapper[4585]: I1201 14:10:55.231195 4585 generic.go:334] "Generic (PLEG): container finished" podID="d29365aa-3c8b-46c7-8b46-eb101a582cc2" containerID="33e38d0400b61991567ea77d2ceeb4ddd25f920addfaa908dfebace9617d7c71" exitCode=0 Dec 01 14:10:55 crc kubenswrapper[4585]: I1201 14:10:55.231244 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" event={"ID":"d29365aa-3c8b-46c7-8b46-eb101a582cc2","Type":"ContainerDied","Data":"33e38d0400b61991567ea77d2ceeb4ddd25f920addfaa908dfebace9617d7c71"} Dec 01 14:10:55 crc kubenswrapper[4585]: I1201 14:10:55.231273 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" event={"ID":"d29365aa-3c8b-46c7-8b46-eb101a582cc2","Type":"ContainerStarted","Data":"19353cf0aac6fdf4deb5d8641cf0664aa53997a3dc44bebd900ac290f60ce7bd"} Dec 01 14:10:56 crc kubenswrapper[4585]: I1201 14:10:56.240724 4585 generic.go:334] "Generic (PLEG): container finished" podID="d29365aa-3c8b-46c7-8b46-eb101a582cc2" containerID="16e2aab742dc4d8628c0812b9a7d28da21ff92fbec431c7e0e2a38bca6b90023" exitCode=0 Dec 01 14:10:56 crc kubenswrapper[4585]: I1201 14:10:56.240836 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" event={"ID":"d29365aa-3c8b-46c7-8b46-eb101a582cc2","Type":"ContainerDied","Data":"16e2aab742dc4d8628c0812b9a7d28da21ff92fbec431c7e0e2a38bca6b90023"} Dec 01 14:10:57 crc kubenswrapper[4585]: I1201 14:10:57.252509 4585 generic.go:334] "Generic (PLEG): container finished" podID="d29365aa-3c8b-46c7-8b46-eb101a582cc2" containerID="d1678f72e68b7693af828a6e9b9fd962603a539d15d4ff3f411e97ad7672d986" exitCode=0 Dec 01 14:10:57 crc kubenswrapper[4585]: I1201 14:10:57.252733 4585 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" event={"ID":"d29365aa-3c8b-46c7-8b46-eb101a582cc2","Type":"ContainerDied","Data":"d1678f72e68b7693af828a6e9b9fd962603a539d15d4ff3f411e97ad7672d986"} Dec 01 14:10:58 crc kubenswrapper[4585]: I1201 14:10:58.503468 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:10:58 crc kubenswrapper[4585]: I1201 14:10:58.644668 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d29365aa-3c8b-46c7-8b46-eb101a582cc2-util\") pod \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\" (UID: \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\") " Dec 01 14:10:58 crc kubenswrapper[4585]: I1201 14:10:58.644815 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d29365aa-3c8b-46c7-8b46-eb101a582cc2-bundle\") pod \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\" (UID: \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\") " Dec 01 14:10:58 crc kubenswrapper[4585]: I1201 14:10:58.644881 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9vbf\" (UniqueName: \"kubernetes.io/projected/d29365aa-3c8b-46c7-8b46-eb101a582cc2-kube-api-access-v9vbf\") pod \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\" (UID: \"d29365aa-3c8b-46c7-8b46-eb101a582cc2\") " Dec 01 14:10:58 crc kubenswrapper[4585]: I1201 14:10:58.646174 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d29365aa-3c8b-46c7-8b46-eb101a582cc2-bundle" (OuterVolumeSpecName: "bundle") pod "d29365aa-3c8b-46c7-8b46-eb101a582cc2" (UID: "d29365aa-3c8b-46c7-8b46-eb101a582cc2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:10:58 crc kubenswrapper[4585]: I1201 14:10:58.652166 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d29365aa-3c8b-46c7-8b46-eb101a582cc2-kube-api-access-v9vbf" (OuterVolumeSpecName: "kube-api-access-v9vbf") pod "d29365aa-3c8b-46c7-8b46-eb101a582cc2" (UID: "d29365aa-3c8b-46c7-8b46-eb101a582cc2"). InnerVolumeSpecName "kube-api-access-v9vbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:10:58 crc kubenswrapper[4585]: I1201 14:10:58.658745 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d29365aa-3c8b-46c7-8b46-eb101a582cc2-util" (OuterVolumeSpecName: "util") pod "d29365aa-3c8b-46c7-8b46-eb101a582cc2" (UID: "d29365aa-3c8b-46c7-8b46-eb101a582cc2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:10:58 crc kubenswrapper[4585]: I1201 14:10:58.747163 4585 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d29365aa-3c8b-46c7-8b46-eb101a582cc2-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:10:58 crc kubenswrapper[4585]: I1201 14:10:58.747189 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9vbf\" (UniqueName: \"kubernetes.io/projected/d29365aa-3c8b-46c7-8b46-eb101a582cc2-kube-api-access-v9vbf\") on node \"crc\" DevicePath \"\"" Dec 01 14:10:58 crc kubenswrapper[4585]: I1201 14:10:58.747200 4585 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d29365aa-3c8b-46c7-8b46-eb101a582cc2-util\") on node \"crc\" DevicePath \"\"" Dec 01 14:10:59 crc kubenswrapper[4585]: I1201 14:10:59.265943 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" event={"ID":"d29365aa-3c8b-46c7-8b46-eb101a582cc2","Type":"ContainerDied","Data":"19353cf0aac6fdf4deb5d8641cf0664aa53997a3dc44bebd900ac290f60ce7bd"} Dec 01 14:10:59 crc kubenswrapper[4585]: I1201 14:10:59.266006 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19353cf0aac6fdf4deb5d8641cf0664aa53997a3dc44bebd900ac290f60ce7bd" Dec 01 14:10:59 crc kubenswrapper[4585]: I1201 14:10:59.266057 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c" Dec 01 14:11:05 crc kubenswrapper[4585]: I1201 14:11:05.575157 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp"] Dec 01 14:11:05 crc kubenswrapper[4585]: E1201 14:11:05.575981 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29365aa-3c8b-46c7-8b46-eb101a582cc2" containerName="pull" Dec 01 14:11:05 crc kubenswrapper[4585]: I1201 14:11:05.575997 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29365aa-3c8b-46c7-8b46-eb101a582cc2" containerName="pull" Dec 01 14:11:05 crc kubenswrapper[4585]: E1201 14:11:05.576020 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29365aa-3c8b-46c7-8b46-eb101a582cc2" containerName="util" Dec 01 14:11:05 crc kubenswrapper[4585]: I1201 14:11:05.576027 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29365aa-3c8b-46c7-8b46-eb101a582cc2" containerName="util" Dec 01 14:11:05 crc kubenswrapper[4585]: E1201 14:11:05.576043 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d29365aa-3c8b-46c7-8b46-eb101a582cc2" containerName="extract" Dec 01 14:11:05 crc kubenswrapper[4585]: I1201 14:11:05.576050 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29365aa-3c8b-46c7-8b46-eb101a582cc2" containerName="extract" Dec 01 14:11:05 crc kubenswrapper[4585]: I1201 14:11:05.576161 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d29365aa-3c8b-46c7-8b46-eb101a582cc2" containerName="extract" Dec 01 14:11:05 crc kubenswrapper[4585]: I1201 14:11:05.576560 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp" Dec 01 14:11:05 crc kubenswrapper[4585]: I1201 14:11:05.590166 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-526n4" Dec 01 14:11:05 crc kubenswrapper[4585]: I1201 14:11:05.612882 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp"] Dec 01 14:11:05 crc kubenswrapper[4585]: I1201 14:11:05.733448 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr5rd\" (UniqueName: \"kubernetes.io/projected/9c187dfe-402b-4e73-8f4f-3d9dcf360954-kube-api-access-pr5rd\") pod \"openstack-operator-controller-operator-d645d669b-rhjvp\" (UID: \"9c187dfe-402b-4e73-8f4f-3d9dcf360954\") " pod="openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp" Dec 01 14:11:05 crc kubenswrapper[4585]: I1201 14:11:05.834781 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr5rd\" (UniqueName: \"kubernetes.io/projected/9c187dfe-402b-4e73-8f4f-3d9dcf360954-kube-api-access-pr5rd\") pod \"openstack-operator-controller-operator-d645d669b-rhjvp\" (UID: \"9c187dfe-402b-4e73-8f4f-3d9dcf360954\") " pod="openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp" Dec 01 14:11:05 crc kubenswrapper[4585]: I1201 14:11:05.858427 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr5rd\" (UniqueName: \"kubernetes.io/projected/9c187dfe-402b-4e73-8f4f-3d9dcf360954-kube-api-access-pr5rd\") pod \"openstack-operator-controller-operator-d645d669b-rhjvp\" (UID: \"9c187dfe-402b-4e73-8f4f-3d9dcf360954\") " pod="openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp" Dec 01 14:11:05 crc kubenswrapper[4585]: I1201 14:11:05.894768 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp" Dec 01 14:11:06 crc kubenswrapper[4585]: I1201 14:11:06.135612 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp"] Dec 01 14:11:06 crc kubenswrapper[4585]: I1201 14:11:06.309453 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp" event={"ID":"9c187dfe-402b-4e73-8f4f-3d9dcf360954","Type":"ContainerStarted","Data":"4c2fcc752db646575bdaed1b2c384eeaec9fff02cd99489d46531d9a92099235"} Dec 01 14:11:11 crc kubenswrapper[4585]: I1201 14:11:11.344919 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp" event={"ID":"9c187dfe-402b-4e73-8f4f-3d9dcf360954","Type":"ContainerStarted","Data":"5322b25e5d97c701539d150bd6fd9f9279378239b089c4aa19e65274306de3ae"} Dec 01 14:11:11 crc kubenswrapper[4585]: I1201 14:11:11.345476 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp" Dec 01 14:11:11 crc kubenswrapper[4585]: I1201 14:11:11.374221 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp" podStartSLOduration=1.7449301579999998 podStartE2EDuration="6.374207641s" podCreationTimestamp="2025-12-01 14:11:05 +0000 UTC" firstStartedPulling="2025-12-01 14:11:06.16015584 +0000 UTC m=+780.144369695" lastFinishedPulling="2025-12-01 14:11:10.789433323 +0000 UTC m=+784.773647178" observedRunningTime="2025-12-01 14:11:11.37118488 +0000 UTC m=+785.355398725" watchObservedRunningTime="2025-12-01 14:11:11.374207641 +0000 UTC m=+785.358421486" Dec 01 14:11:13 crc kubenswrapper[4585]: I1201 14:11:13.716382 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:11:13 crc kubenswrapper[4585]: I1201 14:11:13.716731 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:11:15 crc kubenswrapper[4585]: I1201 14:11:15.897834 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-d645d669b-rhjvp" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.218498 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.220023 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.223098 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.223674 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-czt8v" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.224087 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.230210 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-9vf6c" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.234951 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.251648 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.260791 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.261737 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.263655 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-wmdkm" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.266182 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.267339 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.269863 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4zvn4" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.306485 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.308835 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9472j\" (UniqueName: \"kubernetes.io/projected/c4697227-2800-4a64-89bf-5bf831077ceb-kube-api-access-9472j\") pod \"designate-operator-controller-manager-78b4bc895b-8qd82\" (UID: \"c4697227-2800-4a64-89bf-5bf831077ceb\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.308925 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg9l8\" (UniqueName: \"kubernetes.io/projected/7f8c91fb-441e-44f0-bf97-1340df47f4b0-kube-api-access-jg9l8\") pod \"glance-operator-controller-manager-668d9c48b9-4kmsq\" (UID: \"7f8c91fb-441e-44f0-bf97-1340df47f4b0\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.308957 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt24w\" (UniqueName: \"kubernetes.io/projected/59994d2c-6485-4beb-bcfc-3fd4a22bd203-kube-api-access-wt24w\") pod \"cinder-operator-controller-manager-859b6ccc6-qnbzj\" (UID: \"59994d2c-6485-4beb-bcfc-3fd4a22bd203\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.308987 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcf6j\" (UniqueName: \"kubernetes.io/projected/8da768c2-cb8c-40f9-b8d1-54a66743b340-kube-api-access-jcf6j\") pod \"barbican-operator-controller-manager-7d9dfd778-br8df\" (UID: \"8da768c2-cb8c-40f9-b8d1-54a66743b340\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.314816 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.316018 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.318835 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.320362 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-bldx6" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.327396 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.328296 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.333351 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dfr6h" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.342456 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.387098 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.394627 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-77nsb"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.395647 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.399751 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-glw2s" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.399909 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.417738 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt24w\" (UniqueName: \"kubernetes.io/projected/59994d2c-6485-4beb-bcfc-3fd4a22bd203-kube-api-access-wt24w\") pod \"cinder-operator-controller-manager-859b6ccc6-qnbzj\" (UID: \"59994d2c-6485-4beb-bcfc-3fd4a22bd203\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.417778 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcf6j\" (UniqueName: \"kubernetes.io/projected/8da768c2-cb8c-40f9-b8d1-54a66743b340-kube-api-access-jcf6j\") pod \"barbican-operator-controller-manager-7d9dfd778-br8df\" (UID: \"8da768c2-cb8c-40f9-b8d1-54a66743b340\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.417827 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9472j\" (UniqueName: \"kubernetes.io/projected/c4697227-2800-4a64-89bf-5bf831077ceb-kube-api-access-9472j\") pod \"designate-operator-controller-manager-78b4bc895b-8qd82\" (UID: \"c4697227-2800-4a64-89bf-5bf831077ceb\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.417870 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pnjj\" (UniqueName: \"kubernetes.io/projected/5afd79b9-5528-4ffe-9d3f-ac7b05502348-kube-api-access-2pnjj\") pod \"horizon-operator-controller-manager-68c6d99b8f-vxz95\" (UID: \"5afd79b9-5528-4ffe-9d3f-ac7b05502348\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.417905 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-952gr\" (UniqueName: \"kubernetes.io/projected/60bbdebb-4ac8-4971-82b2-252a989a8c3a-kube-api-access-952gr\") pod \"heat-operator-controller-manager-5f64f6f8bb-qpqgr\" (UID: \"60bbdebb-4ac8-4971-82b2-252a989a8c3a\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.417924 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9l8\" (UniqueName: \"kubernetes.io/projected/7f8c91fb-441e-44f0-bf97-1340df47f4b0-kube-api-access-jg9l8\") pod \"glance-operator-controller-manager-668d9c48b9-4kmsq\" (UID: \"7f8c91fb-441e-44f0-bf97-1340df47f4b0\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.422952 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.424287 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.431349 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7nncr" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.438153 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-77nsb"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.469519 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.474224 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.477004 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9472j\" (UniqueName: \"kubernetes.io/projected/c4697227-2800-4a64-89bf-5bf831077ceb-kube-api-access-9472j\") pod \"designate-operator-controller-manager-78b4bc895b-8qd82\" (UID: \"c4697227-2800-4a64-89bf-5bf831077ceb\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.480949 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-p2nq8" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.482055 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg9l8\" (UniqueName: \"kubernetes.io/projected/7f8c91fb-441e-44f0-bf97-1340df47f4b0-kube-api-access-jg9l8\") pod \"glance-operator-controller-manager-668d9c48b9-4kmsq\" (UID: \"7f8c91fb-441e-44f0-bf97-1340df47f4b0\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.483662 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt24w\" (UniqueName: \"kubernetes.io/projected/59994d2c-6485-4beb-bcfc-3fd4a22bd203-kube-api-access-wt24w\") pod \"cinder-operator-controller-manager-859b6ccc6-qnbzj\" (UID: \"59994d2c-6485-4beb-bcfc-3fd4a22bd203\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.489406 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcf6j\" (UniqueName: \"kubernetes.io/projected/8da768c2-cb8c-40f9-b8d1-54a66743b340-kube-api-access-jcf6j\") pod \"barbican-operator-controller-manager-7d9dfd778-br8df\" (UID: \"8da768c2-cb8c-40f9-b8d1-54a66743b340\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.511863 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.518717 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cv5z\" (UniqueName: \"kubernetes.io/projected/f496d7d1-7362-487d-88d7-33e2c26ce97b-kube-api-access-2cv5z\") pod \"infra-operator-controller-manager-57548d458d-77nsb\" (UID: \"f496d7d1-7362-487d-88d7-33e2c26ce97b\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.518773 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert\") pod \"infra-operator-controller-manager-57548d458d-77nsb\" (UID: \"f496d7d1-7362-487d-88d7-33e2c26ce97b\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.518810 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pnjj\" (UniqueName: \"kubernetes.io/projected/5afd79b9-5528-4ffe-9d3f-ac7b05502348-kube-api-access-2pnjj\") pod 
\"horizon-operator-controller-manager-68c6d99b8f-vxz95\" (UID: \"5afd79b9-5528-4ffe-9d3f-ac7b05502348\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.518833 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9csn5\" (UniqueName: \"kubernetes.io/projected/1d516bcd-4ed7-4c83-a07e-3a8f66761090-kube-api-access-9csn5\") pod \"ironic-operator-controller-manager-6c548fd776-sqs7f\" (UID: \"1d516bcd-4ed7-4c83-a07e-3a8f66761090\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.518878 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-952gr\" (UniqueName: \"kubernetes.io/projected/60bbdebb-4ac8-4971-82b2-252a989a8c3a-kube-api-access-952gr\") pod \"heat-operator-controller-manager-5f64f6f8bb-qpqgr\" (UID: \"60bbdebb-4ac8-4971-82b2-252a989a8c3a\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.518934 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvfvd\" (UniqueName: \"kubernetes.io/projected/50b75abe-8fa5-4e48-87bb-560b5609feda-kube-api-access-jvfvd\") pod \"keystone-operator-controller-manager-546d4bdf48-xsbwl\" (UID: \"50b75abe-8fa5-4e48-87bb-560b5609feda\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.522550 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.536950 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.538308 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.538752 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.544899 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.548765 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.559281 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-z24qr" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.565272 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.566203 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.573210 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jcbpv" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.573768 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.579585 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pnjj\" (UniqueName: \"kubernetes.io/projected/5afd79b9-5528-4ffe-9d3f-ac7b05502348-kube-api-access-2pnjj\") pod \"horizon-operator-controller-manager-68c6d99b8f-vxz95\" (UID: \"5afd79b9-5528-4ffe-9d3f-ac7b05502348\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.579903 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.611535 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-952gr\" (UniqueName: \"kubernetes.io/projected/60bbdebb-4ac8-4971-82b2-252a989a8c3a-kube-api-access-952gr\") pod \"heat-operator-controller-manager-5f64f6f8bb-qpqgr\" (UID: \"60bbdebb-4ac8-4971-82b2-252a989a8c3a\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.625565 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.628098 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cv5z\" (UniqueName: \"kubernetes.io/projected/f496d7d1-7362-487d-88d7-33e2c26ce97b-kube-api-access-2cv5z\") pod \"infra-operator-controller-manager-57548d458d-77nsb\" (UID: \"f496d7d1-7362-487d-88d7-33e2c26ce97b\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.628173 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert\") pod \"infra-operator-controller-manager-57548d458d-77nsb\" (UID: \"f496d7d1-7362-487d-88d7-33e2c26ce97b\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.628212 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgdzb\" (UniqueName: \"kubernetes.io/projected/abe5e9b4-4f45-4fb6-92f7-739d4174996b-kube-api-access-bgdzb\") pod \"manila-operator-controller-manager-6546668bfd-pmpnl\" (UID: \"abe5e9b4-4f45-4fb6-92f7-739d4174996b\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.628242 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9csn5\" (UniqueName: \"kubernetes.io/projected/1d516bcd-4ed7-4c83-a07e-3a8f66761090-kube-api-access-9csn5\") pod \"ironic-operator-controller-manager-6c548fd776-sqs7f\" (UID: 
\"1d516bcd-4ed7-4c83-a07e-3a8f66761090\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.628318 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvfvd\" (UniqueName: \"kubernetes.io/projected/50b75abe-8fa5-4e48-87bb-560b5609feda-kube-api-access-jvfvd\") pod \"keystone-operator-controller-manager-546d4bdf48-xsbwl\" (UID: \"50b75abe-8fa5-4e48-87bb-560b5609feda\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.628336 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsbhv\" (UniqueName: \"kubernetes.io/projected/1c21caba-6277-4106-b637-a4874412f527-kube-api-access-jsbhv\") pod \"mariadb-operator-controller-manager-56bbcc9d85-ndffl\" (UID: \"1c21caba-6277-4106-b637-a4874412f527\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl" Dec 01 14:11:34 crc kubenswrapper[4585]: E1201 14:11:34.628760 4585 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 14:11:34 crc kubenswrapper[4585]: E1201 14:11:34.628816 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert podName:f496d7d1-7362-487d-88d7-33e2c26ce97b nodeName:}" failed. No retries permitted until 2025-12-01 14:11:35.128794185 +0000 UTC m=+809.113008040 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert") pod "infra-operator-controller-manager-57548d458d-77nsb" (UID: "f496d7d1-7362-487d-88d7-33e2c26ce97b") : secret "infra-operator-webhook-server-cert" not found Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.641945 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.643984 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.660932 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.665781 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.682210 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9csn5\" (UniqueName: \"kubernetes.io/projected/1d516bcd-4ed7-4c83-a07e-3a8f66761090-kube-api-access-9csn5\") pod \"ironic-operator-controller-manager-6c548fd776-sqs7f\" (UID: \"1d516bcd-4ed7-4c83-a07e-3a8f66761090\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.683376 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvfvd\" (UniqueName: \"kubernetes.io/projected/50b75abe-8fa5-4e48-87bb-560b5609feda-kube-api-access-jvfvd\") pod \"keystone-operator-controller-manager-546d4bdf48-xsbwl\" (UID: \"50b75abe-8fa5-4e48-87bb-560b5609feda\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.684926 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cv5z\" (UniqueName: \"kubernetes.io/projected/f496d7d1-7362-487d-88d7-33e2c26ce97b-kube-api-access-2cv5z\") pod \"infra-operator-controller-manager-57548d458d-77nsb\" (UID: \"f496d7d1-7362-487d-88d7-33e2c26ce97b\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.707200 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.709702 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.710500 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-mnhfd" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.727453 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7wtz2" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.737202 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.745225 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgdzb\" (UniqueName: \"kubernetes.io/projected/abe5e9b4-4f45-4fb6-92f7-739d4174996b-kube-api-access-bgdzb\") pod \"manila-operator-controller-manager-6546668bfd-pmpnl\" (UID: \"abe5e9b4-4f45-4fb6-92f7-739d4174996b\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.745279 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22xzq\" (UniqueName: \"kubernetes.io/projected/0854e7b6-a6fb-4fd0-9e48-564df5d8fea2-kube-api-access-22xzq\") pod \"octavia-operator-controller-manager-998648c74-6hdc6\" (UID: \"0854e7b6-a6fb-4fd0-9e48-564df5d8fea2\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.745320 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-jsbhv\" (UniqueName: \"kubernetes.io/projected/1c21caba-6277-4106-b637-a4874412f527-kube-api-access-jsbhv\") pod \"mariadb-operator-controller-manager-56bbcc9d85-ndffl\" (UID: \"1c21caba-6277-4106-b637-a4874412f527\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.791041 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.792096 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.814325 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8wpgk" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.820546 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.826014 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.828681 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgdzb\" (UniqueName: \"kubernetes.io/projected/abe5e9b4-4f45-4fb6-92f7-739d4174996b-kube-api-access-bgdzb\") pod \"manila-operator-controller-manager-6546668bfd-pmpnl\" (UID: \"abe5e9b4-4f45-4fb6-92f7-739d4174996b\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.830507 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsbhv\" (UniqueName: \"kubernetes.io/projected/1c21caba-6277-4106-b637-a4874412f527-kube-api-access-jsbhv\") pod \"mariadb-operator-controller-manager-56bbcc9d85-ndffl\" (UID: \"1c21caba-6277-4106-b637-a4874412f527\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.845060 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.846311 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.847460 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x449l\" (UniqueName: \"kubernetes.io/projected/f62dd90c-aa85-4650-92e0-13e52ec60360-kube-api-access-x449l\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-8fdcb\" (UID: \"f62dd90c-aa85-4650-92e0-13e52ec60360\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.847524 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22xzq\" (UniqueName: \"kubernetes.io/projected/0854e7b6-a6fb-4fd0-9e48-564df5d8fea2-kube-api-access-22xzq\") pod \"octavia-operator-controller-manager-998648c74-6hdc6\" (UID: \"0854e7b6-a6fb-4fd0-9e48-564df5d8fea2\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.858548 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.863854 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.864544 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-b5tjv" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.887711 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.909743 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22xzq\" (UniqueName: \"kubernetes.io/projected/0854e7b6-a6fb-4fd0-9e48-564df5d8fea2-kube-api-access-22xzq\") pod \"octavia-operator-controller-manager-998648c74-6hdc6\" (UID: \"0854e7b6-a6fb-4fd0-9e48-564df5d8fea2\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.944034 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.953649 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x449l\" (UniqueName: \"kubernetes.io/projected/f62dd90c-aa85-4650-92e0-13e52ec60360-kube-api-access-x449l\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-8fdcb\" (UID: \"f62dd90c-aa85-4650-92e0-13e52ec60360\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.953719 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmvbh\" (UniqueName: \"kubernetes.io/projected/baa99a85-be34-458d-bc16-c367d4635b10-kube-api-access-jmvbh\") pod \"nova-operator-controller-manager-697bc559fc-w4g57\" (UID: \"baa99a85-be34-458d-bc16-c367d4635b10\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.953789 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh\" (UID: \"bcc7d39e-d462-4eaa-89fa-625c72c956b6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.953816 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzpqv\" (UniqueName: \"kubernetes.io/projected/bcc7d39e-d462-4eaa-89fa-625c72c956b6-kube-api-access-lzpqv\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh\" (UID: \"bcc7d39e-d462-4eaa-89fa-625c72c956b6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.995181 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj"] Dec 01 14:11:34 crc kubenswrapper[4585]: I1201 14:11:34.996316 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.007489 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x449l\" (UniqueName: \"kubernetes.io/projected/f62dd90c-aa85-4650-92e0-13e52ec60360-kube-api-access-x449l\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-8fdcb\" (UID: \"f62dd90c-aa85-4650-92e0-13e52ec60360\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.009551 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.018557 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.019237 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-smhkp" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.037984 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.054568 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh\" (UID: \"bcc7d39e-d462-4eaa-89fa-625c72c956b6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.054616 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzpqv\" (UniqueName: \"kubernetes.io/projected/bcc7d39e-d462-4eaa-89fa-625c72c956b6-kube-api-access-lzpqv\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh\" (UID: \"bcc7d39e-d462-4eaa-89fa-625c72c956b6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.054671 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmvbh\" (UniqueName: \"kubernetes.io/projected/baa99a85-be34-458d-bc16-c367d4635b10-kube-api-access-jmvbh\") pod \"nova-operator-controller-manager-697bc559fc-w4g57\" (UID: \"baa99a85-be34-458d-bc16-c367d4635b10\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" Dec 01 14:11:35 crc kubenswrapper[4585]: E1201 14:11:35.055026 4585 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 14:11:35 crc kubenswrapper[4585]: E1201 14:11:35.055072 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert podName:bcc7d39e-d462-4eaa-89fa-625c72c956b6 nodeName:}" failed. No retries permitted until 2025-12-01 14:11:35.555058548 +0000 UTC m=+809.539272403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" (UID: "bcc7d39e-d462-4eaa-89fa-625c72c956b6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.068042 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.069147 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.070037 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.075351 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-q66cl" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.086259 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmvbh\" (UniqueName: \"kubernetes.io/projected/baa99a85-be34-458d-bc16-c367d4635b10-kube-api-access-jmvbh\") pod \"nova-operator-controller-manager-697bc559fc-w4g57\" (UID: \"baa99a85-be34-458d-bc16-c367d4635b10\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.103846 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.121215 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.125259 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.133146 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.134341 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.145327 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.151227 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hl59c" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.156964 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzpqv\" (UniqueName: \"kubernetes.io/projected/bcc7d39e-d462-4eaa-89fa-625c72c956b6-kube-api-access-lzpqv\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh\" (UID: \"bcc7d39e-d462-4eaa-89fa-625c72c956b6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.160366 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mbc5\" (UniqueName: \"kubernetes.io/projected/cdbd6707-63ae-429d-8111-48ab6f912699-kube-api-access-4mbc5\") pod \"placement-operator-controller-manager-78f8948974-m6vwb\" (UID: \"cdbd6707-63ae-429d-8111-48ab6f912699\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.160411 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7p65\" (UniqueName: \"kubernetes.io/projected/7b6381d5-3b01-4c14-a553-e4a51274b140-kube-api-access-h7p65\") pod \"ovn-operator-controller-manager-b6456fdb6-r4qhj\" (UID: \"7b6381d5-3b01-4c14-a553-e4a51274b140\") " 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.160523 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert\") pod \"infra-operator-controller-manager-57548d458d-77nsb\" (UID: \"f496d7d1-7362-487d-88d7-33e2c26ce97b\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:11:35 crc kubenswrapper[4585]: E1201 14:11:35.160634 4585 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 14:11:35 crc kubenswrapper[4585]: E1201 14:11:35.160678 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert podName:f496d7d1-7362-487d-88d7-33e2c26ce97b nodeName:}" failed. No retries permitted until 2025-12-01 14:11:36.160663883 +0000 UTC m=+810.144877738 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert") pod "infra-operator-controller-manager-57548d458d-77nsb" (UID: "f496d7d1-7362-487d-88d7-33e2c26ce97b") : secret "infra-operator-webhook-server-cert" not found Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.168663 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.170190 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.175531 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-g7px5" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.198053 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-798dw"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.224243 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.224342 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-798dw" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.238489 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-twlns" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.261442 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mbc5\" (UniqueName: \"kubernetes.io/projected/cdbd6707-63ae-429d-8111-48ab6f912699-kube-api-access-4mbc5\") pod \"placement-operator-controller-manager-78f8948974-m6vwb\" (UID: \"cdbd6707-63ae-429d-8111-48ab6f912699\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.261514 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7p65\" (UniqueName: \"kubernetes.io/projected/7b6381d5-3b01-4c14-a553-e4a51274b140-kube-api-access-h7p65\") pod \"ovn-operator-controller-manager-b6456fdb6-r4qhj\" (UID: \"7b6381d5-3b01-4c14-a553-e4a51274b140\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.261587 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldf5g\" (UniqueName: \"kubernetes.io/projected/a672a71f-0885-4771-811e-fd658d282a84-kube-api-access-ldf5g\") pod \"telemetry-operator-controller-manager-76cc84c6bb-tgt7n\" (UID: \"a672a71f-0885-4771-811e-fd658d282a84\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.261625 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn2bf\" (UniqueName: \"kubernetes.io/projected/b847594a-d018-4939-8177-3faf4a42da5a-kube-api-access-tn2bf\") pod \"swift-operator-controller-manager-5448bbd495-75vsz\" (UID: \"b847594a-d018-4939-8177-3faf4a42da5a\") " pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.299852 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mbc5\" (UniqueName: \"kubernetes.io/projected/cdbd6707-63ae-429d-8111-48ab6f912699-kube-api-access-4mbc5\") pod \"placement-operator-controller-manager-78f8948974-m6vwb\" (UID: \"cdbd6707-63ae-429d-8111-48ab6f912699\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.300658 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7p65\" (UniqueName: \"kubernetes.io/projected/7b6381d5-3b01-4c14-a553-e4a51274b140-kube-api-access-h7p65\") pod \"ovn-operator-controller-manager-b6456fdb6-r4qhj\" (UID: \"7b6381d5-3b01-4c14-a553-e4a51274b140\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.325512 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.326871 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.335503 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-tjcmb" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.336037 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.369220 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9rgh\" (UniqueName: \"kubernetes.io/projected/c756d201-c2d0-45f1-af3a-acdff1926a1a-kube-api-access-m9rgh\") pod \"test-operator-controller-manager-5854674fcc-798dw\" (UID: \"c756d201-c2d0-45f1-af3a-acdff1926a1a\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-798dw" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.369301 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldf5g\" (UniqueName: \"kubernetes.io/projected/a672a71f-0885-4771-811e-fd658d282a84-kube-api-access-ldf5g\") pod \"telemetry-operator-controller-manager-76cc84c6bb-tgt7n\" (UID: \"a672a71f-0885-4771-811e-fd658d282a84\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.369337 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn2bf\" (UniqueName: \"kubernetes.io/projected/b847594a-d018-4939-8177-3faf4a42da5a-kube-api-access-tn2bf\") pod \"swift-operator-controller-manager-5448bbd495-75vsz\" (UID: \"b847594a-d018-4939-8177-3faf4a42da5a\") " pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.369924 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-798dw"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.391141 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.397344 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn2bf\" (UniqueName: \"kubernetes.io/projected/b847594a-d018-4939-8177-3faf4a42da5a-kube-api-access-tn2bf\") pod \"swift-operator-controller-manager-5448bbd495-75vsz\" (UID: \"b847594a-d018-4939-8177-3faf4a42da5a\") " pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.420044 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.420980 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.426735 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.426905 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.427087 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-4w4pg" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.430002 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldf5g\" (UniqueName: \"kubernetes.io/projected/a672a71f-0885-4771-811e-fd658d282a84-kube-api-access-ldf5g\") pod \"telemetry-operator-controller-manager-76cc84c6bb-tgt7n\" (UID: \"a672a71f-0885-4771-811e-fd658d282a84\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.447897 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.487556 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knw7n\" (UniqueName: \"kubernetes.io/projected/59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4-kube-api-access-knw7n\") pod \"watcher-operator-controller-manager-769dc69bc-cqpws\" (UID: \"59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.487891 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.487941 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9rgh\" (UniqueName: \"kubernetes.io/projected/c756d201-c2d0-45f1-af3a-acdff1926a1a-kube-api-access-m9rgh\") pod \"test-operator-controller-manager-5854674fcc-798dw\" (UID: \"c756d201-c2d0-45f1-af3a-acdff1926a1a\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-798dw" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.488054 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh5pc\" (UniqueName: \"kubernetes.io/projected/2177fed7-edae-4e55-94fd-2037166cbfdc-kube-api-access-nh5pc\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.488104 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: 
\"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.528779 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.534243 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9rgh\" (UniqueName: \"kubernetes.io/projected/c756d201-c2d0-45f1-af3a-acdff1926a1a-kube-api-access-m9rgh\") pod \"test-operator-controller-manager-5854674fcc-798dw\" (UID: \"c756d201-c2d0-45f1-af3a-acdff1926a1a\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-798dw" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.552089 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.596441 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.596655 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.596729 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knw7n\" (UniqueName: \"kubernetes.io/projected/59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4-kube-api-access-knw7n\") pod \"watcher-operator-controller-manager-769dc69bc-cqpws\" (UID: \"59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.596748 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.596787 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh\" (UID: \"bcc7d39e-d462-4eaa-89fa-625c72c956b6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.596825 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh5pc\" (UniqueName: \"kubernetes.io/projected/2177fed7-edae-4e55-94fd-2037166cbfdc-kube-api-access-nh5pc\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:35 crc kubenswrapper[4585]: E1201 14:11:35.597098 4585 secret.go:188] Couldn't get 
secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 14:11:35 crc kubenswrapper[4585]: E1201 14:11:35.597138 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs podName:2177fed7-edae-4e55-94fd-2037166cbfdc nodeName:}" failed. No retries permitted until 2025-12-01 14:11:36.097125486 +0000 UTC m=+810.081339341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs") pod "openstack-operator-controller-manager-b9b8558c-w5sxw" (UID: "2177fed7-edae-4e55-94fd-2037166cbfdc") : secret "metrics-server-cert" not found Dec 01 14:11:35 crc kubenswrapper[4585]: E1201 14:11:35.597272 4585 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 14:11:35 crc kubenswrapper[4585]: E1201 14:11:35.597294 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert podName:bcc7d39e-d462-4eaa-89fa-625c72c956b6 nodeName:}" failed. No retries permitted until 2025-12-01 14:11:36.597287391 +0000 UTC m=+810.581501246 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" (UID: "bcc7d39e-d462-4eaa-89fa-625c72c956b6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 14:11:35 crc kubenswrapper[4585]: E1201 14:11:35.597330 4585 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 14:11:35 crc kubenswrapper[4585]: E1201 14:11:35.597352 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs podName:2177fed7-edae-4e55-94fd-2037166cbfdc nodeName:}" failed. No retries permitted until 2025-12-01 14:11:36.097342202 +0000 UTC m=+810.081556137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs") pod "openstack-operator-controller-manager-b9b8558c-w5sxw" (UID: "2177fed7-edae-4e55-94fd-2037166cbfdc") : secret "webhook-server-cert" not found Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.613420 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-798dw" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.661276 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.663197 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh5pc\" (UniqueName: \"kubernetes.io/projected/2177fed7-edae-4e55-94fd-2037166cbfdc-kube-api-access-nh5pc\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.663590 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.666963 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knw7n\" (UniqueName: \"kubernetes.io/projected/59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4-kube-api-access-knw7n\") pod \"watcher-operator-controller-manager-769dc69bc-cqpws\" (UID: \"59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.667178 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.670863 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-vrj9l" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.693367 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.704050 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df"] Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.722449 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df" event={"ID":"8da768c2-cb8c-40f9-b8d1-54a66743b340","Type":"ContainerStarted","Data":"d130d96c0e49d7ff36c93b37561c1a1e36e0a2ac6d3c7b721a84d30ed8122009"} Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.799586 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjgmb\" (UniqueName: \"kubernetes.io/projected/f4100ac0-da14-4d72-88e8-7f7356dad361-kube-api-access-mjgmb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dcz6q\" (UID: \"f4100ac0-da14-4d72-88e8-7f7356dad361\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.900855 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjgmb\" (UniqueName: \"kubernetes.io/projected/f4100ac0-da14-4d72-88e8-7f7356dad361-kube-api-access-mjgmb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dcz6q\" (UID: \"f4100ac0-da14-4d72-88e8-7f7356dad361\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q" Dec 01 14:11:35 crc kubenswrapper[4585]: I1201 14:11:35.916358 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjgmb\" (UniqueName: \"kubernetes.io/projected/f4100ac0-da14-4d72-88e8-7f7356dad361-kube-api-access-mjgmb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dcz6q\" (UID: \"f4100ac0-da14-4d72-88e8-7f7356dad361\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q" Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.041383 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q" Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.102714 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.102783 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.102891 4585 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.102955 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs podName:2177fed7-edae-4e55-94fd-2037166cbfdc nodeName:}" failed. No retries permitted until 2025-12-01 14:11:37.102938569 +0000 UTC m=+811.087152424 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs") pod "openstack-operator-controller-manager-b9b8558c-w5sxw" (UID: "2177fed7-edae-4e55-94fd-2037166cbfdc") : secret "webhook-server-cert" not found Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.102983 4585 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.103037 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs podName:2177fed7-edae-4e55-94fd-2037166cbfdc nodeName:}" failed. No retries permitted until 2025-12-01 14:11:37.103020442 +0000 UTC m=+811.087234397 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs") pod "openstack-operator-controller-manager-b9b8558c-w5sxw" (UID: "2177fed7-edae-4e55-94fd-2037166cbfdc") : secret "metrics-server-cert" not found Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.204951 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert\") pod \"infra-operator-controller-manager-57548d458d-77nsb\" (UID: \"f496d7d1-7362-487d-88d7-33e2c26ce97b\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.205279 4585 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.205354 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert podName:f496d7d1-7362-487d-88d7-33e2c26ce97b nodeName:}" failed. No retries permitted until 2025-12-01 14:11:38.205338179 +0000 UTC m=+812.189552034 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert") pod "infra-operator-controller-manager-57548d458d-77nsb" (UID: "f496d7d1-7362-487d-88d7-33e2c26ce97b") : secret "infra-operator-webhook-server-cert" not found Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.594454 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6"] Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.621906 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh\" (UID: \"bcc7d39e-d462-4eaa-89fa-625c72c956b6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.622051 4585 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.622104 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert podName:bcc7d39e-d462-4eaa-89fa-625c72c956b6 nodeName:}" failed. No retries permitted until 2025-12-01 14:11:38.622090958 +0000 UTC m=+812.606304813 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" (UID: "bcc7d39e-d462-4eaa-89fa-625c72c956b6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.740756 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6" event={"ID":"0854e7b6-a6fb-4fd0-9e48-564df5d8fea2","Type":"ContainerStarted","Data":"5cd3f1f1457caed26cca88ad098b8902b39573081b33810cb325e5591313e438"} Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.756217 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95"] Dec 01 14:11:36 crc kubenswrapper[4585]: W1201 14:11:36.803038 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5afd79b9_5528_4ffe_9d3f_ac7b05502348.slice/crio-78f668ec2c47b3f9669d86cf827bab86171ed6c93c0b7d411541c4dfc11547bd WatchSource:0}: Error finding container 78f668ec2c47b3f9669d86cf827bab86171ed6c93c0b7d411541c4dfc11547bd: Status 404 returned error can't find the container with id 78f668ec2c47b3f9669d86cf827bab86171ed6c93c0b7d411541c4dfc11547bd Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.804447 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb"] Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.849735 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl"] Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.861745 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl"] Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.877403 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr"] Dec 01 14:11:36 crc kubenswrapper[4585]: W1201 14:11:36.878013 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4697227_2800_4a64_89bf_5bf831077ceb.slice/crio-9a1ae3ec0e368f8818561df5786209147bc1c71cb82fdb87baaf301199705e08 WatchSource:0}: Error finding container 9a1ae3ec0e368f8818561df5786209147bc1c71cb82fdb87baaf301199705e08: Status 404 returned error can't find the container with id 9a1ae3ec0e368f8818561df5786209147bc1c71cb82fdb87baaf301199705e08 Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.888036 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj"] Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.893072 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82"] Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.899162 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n"] Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.899522 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jmvbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-w4g57_openstack-operators(baa99a85-be34-458d-bc16-c367d4635b10): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.900644 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4mbc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-m6vwb_openstack-operators(cdbd6707-63ae-429d-8111-48ab6f912699): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.901437 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jmvbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-w4g57_openstack-operators(baa99a85-be34-458d-bc16-c367d4635b10): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.902325 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4mbc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-m6vwb_openstack-operators(cdbd6707-63ae-429d-8111-48ab6f912699): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.903272 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" podUID="baa99a85-be34-458d-bc16-c367d4635b10" Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.903745 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq"] Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.903889 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" podUID="cdbd6707-63ae-429d-8111-48ab6f912699" Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.913832 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jvfvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-xsbwl_openstack-operators(50b75abe-8fa5-4e48-87bb-560b5609feda): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.917362 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb"] Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.925539 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57"] Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.925836 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jvfvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-xsbwl_openstack-operators(50b75abe-8fa5-4e48-87bb-560b5609feda): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.931596 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f"] Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.931655 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" podUID="50b75abe-8fa5-4e48-87bb-560b5609feda" Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.938372 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl"] Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.958472 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj"] Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.978142 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h7p65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-r4qhj_openstack-operators(7b6381d5-3b01-4c14-a553-e4a51274b140): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.983795 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h7p65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-r4qhj_openstack-operators(7b6381d5-3b01-4c14-a553-e4a51274b140): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 14:11:36 crc kubenswrapper[4585]: E1201 14:11:36.985790 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" podUID="7b6381d5-3b01-4c14-a553-e4a51274b140" Dec 01 14:11:36 crc kubenswrapper[4585]: I1201 14:11:36.995698 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q"] Dec 01 14:11:36 crc kubenswrapper[4585]: W1201 14:11:36.996884 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4100ac0_da14_4d72_88e8_7f7356dad361.slice/crio-abf4b595cb45a7fe1f647a6ab294ddf13004f08e6de4a4f4f689b431d6cbffc4 WatchSource:0}: Error finding container abf4b595cb45a7fe1f647a6ab294ddf13004f08e6de4a4f4f689b431d6cbffc4: Status 404 returned error can't find the container with id abf4b595cb45a7fe1f647a6ab294ddf13004f08e6de4a4f4f689b431d6cbffc4 Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.027573 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz"] Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.046417 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.248:5001/openstack-k8s-operators/swift-operator:7676a13231e369663a54ee29a65e6ea40110f875,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tn2bf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5448bbd495-75vsz_openstack-operators(b847594a-d018-4939-8177-3faf4a42da5a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.049462 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tn2bf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5448bbd495-75vsz_openstack-operators(b847594a-d018-4939-8177-3faf4a42da5a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 14:11:37 crc kubenswrapper[4585]: W1201 14:11:37.050255 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59cf60ed_a0d2_4ddc_bfc5_d5973ecbb9e4.slice/crio-febcc98e4ece8c87146c48a68c56c665fccc13fddb03e354e682271abeaad8df WatchSource:0}: Error finding container febcc98e4ece8c87146c48a68c56c665fccc13fddb03e354e682271abeaad8df: Status 404 returned error can't find the container with id febcc98e4ece8c87146c48a68c56c665fccc13fddb03e354e682271abeaad8df Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.051331 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" podUID="b847594a-d018-4939-8177-3faf4a42da5a" Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.051536 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws"] Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.054854 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-knw7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-cqpws_openstack-operators(59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.056440 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-knw7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-cqpws_openstack-operators(59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.057894 4585 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" podUID="59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4" Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.061006 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-798dw"] Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.142365 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.142463 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.142677 4585 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.142727 4585 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.142831 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs podName:2177fed7-edae-4e55-94fd-2037166cbfdc nodeName:}" failed. No retries permitted until 2025-12-01 14:11:39.142809209 +0000 UTC m=+813.127023064 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs") pod "openstack-operator-controller-manager-b9b8558c-w5sxw" (UID: "2177fed7-edae-4e55-94fd-2037166cbfdc") : secret "metrics-server-cert" not found Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.142891 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs podName:2177fed7-edae-4e55-94fd-2037166cbfdc nodeName:}" failed. No retries permitted until 2025-12-01 14:11:39.14287117 +0000 UTC m=+813.127085025 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs") pod "openstack-operator-controller-manager-b9b8558c-w5sxw" (UID: "2177fed7-edae-4e55-94fd-2037166cbfdc") : secret "webhook-server-cert" not found Dec 01 14:11:37 crc kubenswrapper[4585]: W1201 14:11:37.224120 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc756d201_c2d0_45f1_af3a_acdff1926a1a.slice/crio-34518fb653191fd50afad907ccefe99e776ce3e580e2ec0f067a23a0ae5ffea9 WatchSource:0}: Error finding container 34518fb653191fd50afad907ccefe99e776ce3e580e2ec0f067a23a0ae5ffea9: Status 404 returned error can't find the container with id 34518fb653191fd50afad907ccefe99e776ce3e580e2ec0f067a23a0ae5ffea9 Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.752405 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n" event={"ID":"a672a71f-0885-4771-811e-fd658d282a84","Type":"ContainerStarted","Data":"70efbecbe78d4c40424ef831b9803f67737b307988e03f00f60d254375fb97e8"} Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.753791 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82" event={"ID":"c4697227-2800-4a64-89bf-5bf831077ceb","Type":"ContainerStarted","Data":"9a1ae3ec0e368f8818561df5786209147bc1c71cb82fdb87baaf301199705e08"} Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.755410 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" event={"ID":"cdbd6707-63ae-429d-8111-48ab6f912699","Type":"ContainerStarted","Data":"4d763abe61f4c2ac7b2142d17fe689993e424c354eb93f0cc04efb48cb01c590"} Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.757639 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj" event={"ID":"59994d2c-6485-4beb-bcfc-3fd4a22bd203","Type":"ContainerStarted","Data":"842fb86042082d8eb2667e609246635bed62b26ad63f3e9e5a4956e75fba1708"} Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.764371 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq" event={"ID":"7f8c91fb-441e-44f0-bf97-1340df47f4b0","Type":"ContainerStarted","Data":"d77148fcc161390e62078d754267666e7cbcaaec61ff7a13e474d28240baf6ac"} Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.764382 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" podUID="cdbd6707-63ae-429d-8111-48ab6f912699" Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.766454 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl" 
event={"ID":"abe5e9b4-4f45-4fb6-92f7-739d4174996b","Type":"ContainerStarted","Data":"cf8e581f50fb206a831aae7e344e974757b9ceba03cb4098ae3c04a549d90d13"} Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.769791 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f" event={"ID":"1d516bcd-4ed7-4c83-a07e-3a8f66761090","Type":"ContainerStarted","Data":"7253ed066603e6cc444f8cdce4d7ab4ebddd80ab6a212bdc05a3e68c7ee7a41f"} Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.772702 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" event={"ID":"baa99a85-be34-458d-bc16-c367d4635b10","Type":"ContainerStarted","Data":"001616b248f2ef7f095049ec3bb550b848b405a13b96f5ca42621330a05f1161"} Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.791416 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" podUID="baa99a85-be34-458d-bc16-c367d4635b10" Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.791512 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" event={"ID":"b847594a-d018-4939-8177-3faf4a42da5a","Type":"ContainerStarted","Data":"8e57dddb43e7b8364fcc0a461cdb942b642869c7442868dc6014b8a0f0514d99"} Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.812827 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.248:5001/openstack-k8s-operators/swift-operator:7676a13231e369663a54ee29a65e6ea40110f875\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" podUID="b847594a-d018-4939-8177-3faf4a42da5a" Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.823631 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" event={"ID":"7b6381d5-3b01-4c14-a553-e4a51274b140","Type":"ContainerStarted","Data":"2d32d158372ce713a17b14c6bc39c2353c646baff061f9ade35fcb3fc55b21b5"} Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.827479 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" podUID="7b6381d5-3b01-4c14-a553-e4a51274b140" Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.828493 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl" event={"ID":"1c21caba-6277-4106-b637-a4874412f527","Type":"ContainerStarted","Data":"c5c64c8209ec2a10e39a0e0bd560d58aa9f0910f4459c0b8783559236c534d46"} Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.832054 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" event={"ID":"50b75abe-8fa5-4e48-87bb-560b5609feda","Type":"ContainerStarted","Data":"537aaa8ab607f3c7bb70abb1979e81178cc4c6f6e18e7b73bfa5376a5deb8f7d"} Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.844454 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q" event={"ID":"f4100ac0-da14-4d72-88e8-7f7356dad361","Type":"ContainerStarted","Data":"abf4b595cb45a7fe1f647a6ab294ddf13004f08e6de4a4f4f689b431d6cbffc4"} Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.848685 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" podUID="50b75abe-8fa5-4e48-87bb-560b5609feda" Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.849367 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-798dw" event={"ID":"c756d201-c2d0-45f1-af3a-acdff1926a1a","Type":"ContainerStarted","Data":"34518fb653191fd50afad907ccefe99e776ce3e580e2ec0f067a23a0ae5ffea9"} Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.850524 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb" event={"ID":"f62dd90c-aa85-4650-92e0-13e52ec60360","Type":"ContainerStarted","Data":"16eb545cafed09faa3477fdd4aa9b35bad0a91d20ae97022237094aa1e800510"} Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.860186 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr" event={"ID":"60bbdebb-4ac8-4971-82b2-252a989a8c3a","Type":"ContainerStarted","Data":"f04598941636cf7727e7bc84b1cdd0c8e5abd84b0078e43b579b2e7dae2f4e2f"} Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.876212 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95" event={"ID":"5afd79b9-5528-4ffe-9d3f-ac7b05502348","Type":"ContainerStarted","Data":"78f668ec2c47b3f9669d86cf827bab86171ed6c93c0b7d411541c4dfc11547bd"} Dec 01 14:11:37 crc kubenswrapper[4585]: I1201 14:11:37.877296 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" event={"ID":"59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4","Type":"ContainerStarted","Data":"febcc98e4ece8c87146c48a68c56c665fccc13fddb03e354e682271abeaad8df"} Dec 01 14:11:37 crc kubenswrapper[4585]: E1201 14:11:37.880816 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" podUID="59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4" Dec 01 14:11:38 crc kubenswrapper[4585]: I1201 14:11:38.256431 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert\") pod \"infra-operator-controller-manager-57548d458d-77nsb\" (UID: \"f496d7d1-7362-487d-88d7-33e2c26ce97b\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:11:38 crc kubenswrapper[4585]: E1201 14:11:38.256874 4585 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 14:11:38 crc kubenswrapper[4585]: E1201 14:11:38.257067 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert podName:f496d7d1-7362-487d-88d7-33e2c26ce97b nodeName:}" failed. No retries permitted until 2025-12-01 14:11:42.257029799 +0000 UTC m=+816.241243654 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert") pod "infra-operator-controller-manager-57548d458d-77nsb" (UID: "f496d7d1-7362-487d-88d7-33e2c26ce97b") : secret "infra-operator-webhook-server-cert" not found Dec 01 14:11:38 crc kubenswrapper[4585]: I1201 14:11:38.663768 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh\" (UID: \"bcc7d39e-d462-4eaa-89fa-625c72c956b6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:38 crc kubenswrapper[4585]: E1201 14:11:38.663954 4585 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 14:11:38 crc kubenswrapper[4585]: E1201 14:11:38.664040 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert podName:bcc7d39e-d462-4eaa-89fa-625c72c956b6 nodeName:}" failed. No retries permitted until 2025-12-01 14:11:42.664022378 +0000 UTC m=+816.648236233 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" (UID: "bcc7d39e-d462-4eaa-89fa-625c72c956b6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 14:11:38 crc kubenswrapper[4585]: E1201 14:11:38.898635 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" podUID="cdbd6707-63ae-429d-8111-48ab6f912699" Dec 01 14:11:38 crc kubenswrapper[4585]: E1201 14:11:38.899592 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" podUID="7b6381d5-3b01-4c14-a553-e4a51274b140" Dec 01 14:11:38 crc kubenswrapper[4585]: E1201 14:11:38.899792 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" podUID="59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4" Dec 01 14:11:38 crc kubenswrapper[4585]: E1201 14:11:38.900364 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" podUID="50b75abe-8fa5-4e48-87bb-560b5609feda" Dec 01 14:11:38 crc kubenswrapper[4585]: E1201 14:11:38.900443 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" podUID="baa99a85-be34-458d-bc16-c367d4635b10" Dec 01 14:11:38 crc 
kubenswrapper[4585]: E1201 14:11:38.907316 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.248:5001/openstack-k8s-operators/swift-operator:7676a13231e369663a54ee29a65e6ea40110f875\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" podUID="b847594a-d018-4939-8177-3faf4a42da5a" Dec 01 14:11:39 crc kubenswrapper[4585]: I1201 14:11:39.176441 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:39 crc kubenswrapper[4585]: I1201 14:11:39.176569 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:39 crc kubenswrapper[4585]: E1201 14:11:39.176637 4585 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 14:11:39 crc kubenswrapper[4585]: E1201 14:11:39.176923 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs podName:2177fed7-edae-4e55-94fd-2037166cbfdc nodeName:}" failed. No retries permitted until 2025-12-01 14:11:43.176900349 +0000 UTC m=+817.161114274 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs") pod "openstack-operator-controller-manager-b9b8558c-w5sxw" (UID: "2177fed7-edae-4e55-94fd-2037166cbfdc") : secret "webhook-server-cert" not found Dec 01 14:11:39 crc kubenswrapper[4585]: E1201 14:11:39.177290 4585 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 14:11:39 crc kubenswrapper[4585]: E1201 14:11:39.177578 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs podName:2177fed7-edae-4e55-94fd-2037166cbfdc nodeName:}" failed. No retries permitted until 2025-12-01 14:11:43.177557236 +0000 UTC m=+817.161771091 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs") pod "openstack-operator-controller-manager-b9b8558c-w5sxw" (UID: "2177fed7-edae-4e55-94fd-2037166cbfdc") : secret "metrics-server-cert" not found Dec 01 14:11:42 crc kubenswrapper[4585]: I1201 14:11:42.323568 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert\") pod \"infra-operator-controller-manager-57548d458d-77nsb\" (UID: \"f496d7d1-7362-487d-88d7-33e2c26ce97b\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:11:42 crc kubenswrapper[4585]: E1201 14:11:42.323712 4585 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 14:11:42 crc kubenswrapper[4585]: E1201 14:11:42.324129 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert podName:f496d7d1-7362-487d-88d7-33e2c26ce97b nodeName:}" failed. No retries permitted until 2025-12-01 14:11:50.324108812 +0000 UTC m=+824.308322677 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert") pod "infra-operator-controller-manager-57548d458d-77nsb" (UID: "f496d7d1-7362-487d-88d7-33e2c26ce97b") : secret "infra-operator-webhook-server-cert" not found Dec 01 14:11:42 crc kubenswrapper[4585]: I1201 14:11:42.728913 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh\" (UID: \"bcc7d39e-d462-4eaa-89fa-625c72c956b6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:42 crc kubenswrapper[4585]: E1201 14:11:42.729130 4585 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 14:11:42 crc kubenswrapper[4585]: E1201 14:11:42.729217 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert podName:bcc7d39e-d462-4eaa-89fa-625c72c956b6 nodeName:}" failed. No retries permitted until 2025-12-01 14:11:50.729194049 +0000 UTC m=+824.713407914 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" (UID: "bcc7d39e-d462-4eaa-89fa-625c72c956b6") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 14:11:43 crc kubenswrapper[4585]: I1201 14:11:43.236957 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:43 crc kubenswrapper[4585]: I1201 14:11:43.237353 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:43 crc kubenswrapper[4585]: E1201 14:11:43.237185 4585 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 14:11:43 crc kubenswrapper[4585]: E1201 14:11:43.237562 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs podName:2177fed7-edae-4e55-94fd-2037166cbfdc nodeName:}" failed. No retries permitted until 2025-12-01 14:11:51.23754659 +0000 UTC m=+825.221760445 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs") pod "openstack-operator-controller-manager-b9b8558c-w5sxw" (UID: "2177fed7-edae-4e55-94fd-2037166cbfdc") : secret "webhook-server-cert" not found Dec 01 14:11:43 crc kubenswrapper[4585]: E1201 14:11:43.237511 4585 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 14:11:43 crc kubenswrapper[4585]: E1201 14:11:43.237919 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs podName:2177fed7-edae-4e55-94fd-2037166cbfdc nodeName:}" failed. No retries permitted until 2025-12-01 14:11:51.237894669 +0000 UTC m=+825.222108524 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs") pod "openstack-operator-controller-manager-b9b8558c-w5sxw" (UID: "2177fed7-edae-4e55-94fd-2037166cbfdc") : secret "metrics-server-cert" not found Dec 01 14:11:43 crc kubenswrapper[4585]: I1201 14:11:43.716748 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:11:43 crc kubenswrapper[4585]: I1201 14:11:43.716819 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:11:43 crc kubenswrapper[4585]: I1201 14:11:43.716869 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:11:43 crc kubenswrapper[4585]: I1201 14:11:43.717671 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c565360e1e1b852f24cf87ad3ed2b80ca20fd43a45c1f1f0ee3553f5b1d6b02"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:11:43 crc kubenswrapper[4585]: I1201 14:11:43.720139 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://9c565360e1e1b852f24cf87ad3ed2b80ca20fd43a45c1f1f0ee3553f5b1d6b02" gracePeriod=600 Dec 01 14:11:43 crc kubenswrapper[4585]: I1201 14:11:43.942210 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="9c565360e1e1b852f24cf87ad3ed2b80ca20fd43a45c1f1f0ee3553f5b1d6b02" exitCode=0 Dec 01 14:11:43 crc kubenswrapper[4585]: I1201 14:11:43.942250 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"9c565360e1e1b852f24cf87ad3ed2b80ca20fd43a45c1f1f0ee3553f5b1d6b02"} Dec 01 14:11:43 crc kubenswrapper[4585]: I1201 14:11:43.942280 4585 scope.go:117] "RemoveContainer" containerID="80328dc2704086bb4c5e275cec97e65017716d1273f5d086800acbe8844177d3" Dec 01 14:11:48 crc kubenswrapper[4585]: E1201 14:11:48.934438 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 01 14:11:48 crc kubenswrapper[4585]: E1201 14:11:48.935114 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2pnjj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-vxz95_openstack-operators(5afd79b9-5528-4ffe-9d3f-ac7b05502348): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:11:49 crc kubenswrapper[4585]: E1201 14:11:49.478716 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 01 14:11:49 crc kubenswrapper[4585]: E1201 14:11:49.479181 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-952gr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-qpqgr_openstack-operators(60bbdebb-4ac8-4971-82b2-252a989a8c3a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:11:49 crc kubenswrapper[4585]: E1201 14:11:49.981604 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 01 14:11:49 crc kubenswrapper[4585]: E1201 14:11:49.981793 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-22xzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-6hdc6_openstack-operators(0854e7b6-a6fb-4fd0-9e48-564df5d8fea2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:11:50 crc kubenswrapper[4585]: I1201 14:11:50.344267 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert\") pod \"infra-operator-controller-manager-57548d458d-77nsb\" (UID: \"f496d7d1-7362-487d-88d7-33e2c26ce97b\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:11:50 crc kubenswrapper[4585]: E1201 14:11:50.344414 4585 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 14:11:50 crc kubenswrapper[4585]: E1201 14:11:50.344486 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert podName:f496d7d1-7362-487d-88d7-33e2c26ce97b nodeName:}" failed. No retries permitted until 2025-12-01 14:12:06.344469682 +0000 UTC m=+840.328683537 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert") pod "infra-operator-controller-manager-57548d458d-77nsb" (UID: "f496d7d1-7362-487d-88d7-33e2c26ce97b") : secret "infra-operator-webhook-server-cert" not found Dec 01 14:11:50 crc kubenswrapper[4585]: I1201 14:11:50.750257 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh\" (UID: \"bcc7d39e-d462-4eaa-89fa-625c72c956b6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:50 crc kubenswrapper[4585]: I1201 14:11:50.759891 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcc7d39e-d462-4eaa-89fa-625c72c956b6-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh\" (UID: \"bcc7d39e-d462-4eaa-89fa-625c72c956b6\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:50 crc kubenswrapper[4585]: I1201 14:11:50.766478 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:11:51 crc kubenswrapper[4585]: I1201 14:11:51.256188 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:51 crc kubenswrapper[4585]: I1201 14:11:51.256256 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:11:51 crc kubenswrapper[4585]: E1201 14:11:51.256357 4585 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 14:11:51 crc kubenswrapper[4585]: E1201 14:11:51.256359 4585 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 14:11:51 crc kubenswrapper[4585]: E1201 14:11:51.256407 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs podName:2177fed7-edae-4e55-94fd-2037166cbfdc nodeName:}" failed. No retries permitted until 2025-12-01 14:12:07.256392491 +0000 UTC m=+841.240606346 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs") pod "openstack-operator-controller-manager-b9b8558c-w5sxw" (UID: "2177fed7-edae-4e55-94fd-2037166cbfdc") : secret "metrics-server-cert" not found Dec 01 14:11:51 crc kubenswrapper[4585]: E1201 14:11:51.256419 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs podName:2177fed7-edae-4e55-94fd-2037166cbfdc nodeName:}" failed. No retries permitted until 2025-12-01 14:12:07.256414321 +0000 UTC m=+841.240628176 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs") pod "openstack-operator-controller-manager-b9b8558c-w5sxw" (UID: "2177fed7-edae-4e55-94fd-2037166cbfdc") : secret "webhook-server-cert" not found Dec 01 14:11:51 crc kubenswrapper[4585]: E1201 14:11:51.493542 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 01 14:11:51 crc kubenswrapper[4585]: E1201 14:11:51.493768 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9472j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-8qd82_openstack-operators(c4697227-2800-4a64-89bf-5bf831077ceb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:11:53 crc kubenswrapper[4585]: E1201 14:11:53.216330 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 01 14:11:53 crc kubenswrapper[4585]: E1201 14:11:53.216727 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wt24w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-qnbzj_openstack-operators(59994d2c-6485-4beb-bcfc-3fd4a22bd203): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:11:54 crc kubenswrapper[4585]: E1201 14:11:54.521986 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385" Dec 01 14:11:54 crc kubenswrapper[4585]: E1201 14:11:54.522194 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ldf5g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-tgt7n_openstack-operators(a672a71f-0885-4771-811e-fd658d282a84): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:11:55 crc kubenswrapper[4585]: E1201 14:11:55.175202 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:440cde33d3a2a0c545cd1c110a3634eb85544370f448865b97a13c38034b0172" Dec 01 14:11:55 crc kubenswrapper[4585]: E1201 14:11:55.175775 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:440cde33d3a2a0c545cd1c110a3634eb85544370f448865b97a13c38034b0172,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jg9l8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-668d9c48b9-4kmsq_openstack-operators(7f8c91fb-441e-44f0-bf97-1340df47f4b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:11:57 crc kubenswrapper[4585]: E1201 14:11:57.822662 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 01 14:11:57 crc kubenswrapper[4585]: E1201 14:11:57.823021 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m9rgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-798dw_openstack-operators(c756d201-c2d0-45f1-af3a-acdff1926a1a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:11:59 crc kubenswrapper[4585]: E1201 14:11:59.297084 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 01 14:11:59 crc kubenswrapper[4585]: E1201 14:11:59.297283 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mjgmb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-dcz6q_openstack-operators(f4100ac0-da14-4d72-88e8-7f7356dad361): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Dec 01 14:11:59 crc kubenswrapper[4585]: E1201 14:11:59.298668 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q" podUID="f4100ac0-da14-4d72-88e8-7f7356dad361" Dec 01 14:12:00 crc kubenswrapper[4585]: E1201 14:12:00.052606 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q" podUID="f4100ac0-da14-4d72-88e8-7f7356dad361" Dec 01 14:12:06 crc kubenswrapper[4585]: E1201 14:12:06.276707 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 01 14:12:06 crc kubenswrapper[4585]: E1201 14:12:06.277411 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jmvbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-w4g57_openstack-operators(baa99a85-be34-458d-bc16-c367d4635b10): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:12:06 crc kubenswrapper[4585]: I1201 14:12:06.361936 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert\") pod \"infra-operator-controller-manager-57548d458d-77nsb\" (UID: \"f496d7d1-7362-487d-88d7-33e2c26ce97b\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:12:06 crc kubenswrapper[4585]: I1201 14:12:06.368047 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f496d7d1-7362-487d-88d7-33e2c26ce97b-cert\") pod \"infra-operator-controller-manager-57548d458d-77nsb\" (UID: \"f496d7d1-7362-487d-88d7-33e2c26ce97b\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:12:06 crc kubenswrapper[4585]: I1201 14:12:06.541328 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-glw2s" Dec 01 14:12:06 crc kubenswrapper[4585]: I1201 14:12:06.550534 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:12:07 crc kubenswrapper[4585]: I1201 14:12:07.213964 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh"] Dec 01 14:12:07 crc kubenswrapper[4585]: I1201 14:12:07.278026 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:12:07 crc kubenswrapper[4585]: I1201 14:12:07.278118 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:12:07 crc kubenswrapper[4585]: I1201 14:12:07.284738 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-webhook-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:12:07 crc kubenswrapper[4585]: I1201 14:12:07.284776 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2177fed7-edae-4e55-94fd-2037166cbfdc-metrics-certs\") pod \"openstack-operator-controller-manager-b9b8558c-w5sxw\" (UID: \"2177fed7-edae-4e55-94fd-2037166cbfdc\") " pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:12:07 crc kubenswrapper[4585]: I1201 14:12:07.520535 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-4w4pg" Dec 01 14:12:07 crc kubenswrapper[4585]: I1201 14:12:07.528813 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:12:08 crc kubenswrapper[4585]: I1201 14:12:08.139745 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f" event={"ID":"1d516bcd-4ed7-4c83-a07e-3a8f66761090","Type":"ContainerStarted","Data":"1b3c01851de922dd4b7565f2eb33d29f91d061862721bde4b7582e8f0c4cad86"} Dec 01 14:12:08 crc kubenswrapper[4585]: I1201 14:12:08.148887 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl" event={"ID":"1c21caba-6277-4106-b637-a4874412f527","Type":"ContainerStarted","Data":"3f1703a22b34f19601b22d4411539e8c1f95914c2e3b38ef0c8d8179d8709a9a"} Dec 01 14:12:08 crc kubenswrapper[4585]: I1201 14:12:08.170544 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" event={"ID":"bcc7d39e-d462-4eaa-89fa-625c72c956b6","Type":"ContainerStarted","Data":"918a58010ed56b76786ea12b32f74286067bd0ad3f3feb6cb7c8232d2e956f39"} Dec 01 14:12:08 crc kubenswrapper[4585]: I1201 14:12:08.230785 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb" event={"ID":"f62dd90c-aa85-4650-92e0-13e52ec60360","Type":"ContainerStarted","Data":"b4ea4397493f97214209da674525212c656d3204f85a4d15caddecef4bc44c8f"} Dec 01 14:12:08 crc kubenswrapper[4585]: I1201 14:12:08.271573 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-77nsb"] Dec 01 14:12:08 crc kubenswrapper[4585]: I1201 14:12:08.281864 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"041ce578b922e9949f0fa3c528cdc2179672e360c800f9ad0c54e96def5e8b8a"} Dec 01 14:12:08 crc kubenswrapper[4585]: I1201 14:12:08.313231 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl" event={"ID":"abe5e9b4-4f45-4fb6-92f7-739d4174996b","Type":"ContainerStarted","Data":"2c7967c44a75c4398abd7f5db1443fe167392e6dcd7f1697b38487de34b03c45"} Dec 01 14:12:09 crc kubenswrapper[4585]: I1201 14:12:09.330752 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df" event={"ID":"8da768c2-cb8c-40f9-b8d1-54a66743b340","Type":"ContainerStarted","Data":"f705306e33f861c715c116edddb7b99c0662eae994a006cf38ceebc090a3e7ef"} Dec 01 14:12:09 crc kubenswrapper[4585]: W1201 14:12:09.614214 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf496d7d1_7362_487d_88d7_33e2c26ce97b.slice/crio-a28a2dac0080d427d0a931b9eecdb5df30b7f373238e1e3ca83f8017a6d98b37 WatchSource:0}: Error finding container a28a2dac0080d427d0a931b9eecdb5df30b7f373238e1e3ca83f8017a6d98b37: Status 404 returned error can't find the container with id a28a2dac0080d427d0a931b9eecdb5df30b7f373238e1e3ca83f8017a6d98b37 Dec 01 14:12:09 crc kubenswrapper[4585]: I1201 14:12:09.946298 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw"] Dec 01 14:12:10 crc kubenswrapper[4585]: I1201 14:12:10.338861 4585 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" event={"ID":"f496d7d1-7362-487d-88d7-33e2c26ce97b","Type":"ContainerStarted","Data":"a28a2dac0080d427d0a931b9eecdb5df30b7f373238e1e3ca83f8017a6d98b37"} Dec 01 14:12:10 crc kubenswrapper[4585]: I1201 14:12:10.342053 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" event={"ID":"cdbd6707-63ae-429d-8111-48ab6f912699","Type":"ContainerStarted","Data":"9d8e8eae57ca9832ea4c223583dfc612fca9ac9cf1b7a9df7e0891fa9ae6e5c3"} Dec 01 14:12:10 crc kubenswrapper[4585]: I1201 14:12:10.343180 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" event={"ID":"2177fed7-edae-4e55-94fd-2037166cbfdc","Type":"ContainerStarted","Data":"52c14364061e9468f2e03517a5317f7f3404303e3ef814c8686bfdec4e153a3d"} Dec 01 14:12:10 crc kubenswrapper[4585]: I1201 14:12:10.344691 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" event={"ID":"59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4","Type":"ContainerStarted","Data":"bfced8a8bc2fceaf14e174964453b070dafb74d56de05e31abd433190f86f7ac"} Dec 01 14:12:11 crc kubenswrapper[4585]: I1201 14:12:11.354479 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" event={"ID":"7b6381d5-3b01-4c14-a553-e4a51274b140","Type":"ContainerStarted","Data":"712f07241ecf5ab9ffe1c5b0849cba222b9186e66fbc7ea2a714dac9dddc1415"} Dec 01 14:12:11 crc kubenswrapper[4585]: I1201 14:12:11.360946 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" event={"ID":"50b75abe-8fa5-4e48-87bb-560b5609feda","Type":"ContainerStarted","Data":"4e4d8d63d1f11645df7667be6979bf9c862ec67f374fdb3dbcd5ffac4133c5db"} Dec 01 14:12:11 crc kubenswrapper[4585]: I1201 14:12:11.364281 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" event={"ID":"b847594a-d018-4939-8177-3faf4a42da5a","Type":"ContainerStarted","Data":"5f451b4244b8f519bd52c939c4a836cd6832c8ace8cbe5ff4a9c6e4456e8df47"} Dec 01 14:12:11 crc kubenswrapper[4585]: E1201 14:12:11.565679 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6" podUID="0854e7b6-a6fb-4fd0-9e48-564df5d8fea2" Dec 01 14:12:11 crc kubenswrapper[4585]: E1201 14:12:11.640168 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq" podUID="7f8c91fb-441e-44f0-bf97-1340df47f4b0" Dec 01 14:12:11 crc kubenswrapper[4585]: E1201 14:12:11.917908 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" podUID="baa99a85-be34-458d-bc16-c367d4635b10" Dec 01 14:12:12 crc 
kubenswrapper[4585]: I1201 14:12:12.433316 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" event={"ID":"cdbd6707-63ae-429d-8111-48ab6f912699","Type":"ContainerStarted","Data":"b0ac4787b2bb4b7e916d97982c1c2d88c0f674bb47efbaac29aa342cd84d59d6"} Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.433762 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.454211 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq" event={"ID":"7f8c91fb-441e-44f0-bf97-1340df47f4b0","Type":"ContainerStarted","Data":"1b2d844e97ed0b62606dcd6a801fcf39a3e1f12d527b26283e5af2f25a3df9c5"} Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.469479 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" event={"ID":"2177fed7-edae-4e55-94fd-2037166cbfdc","Type":"ContainerStarted","Data":"7efef193b7ffa86a3dc56b3aaa5bb83a415fb68b32c2727befec28d39c376dae"} Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.470836 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.472777 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" podStartSLOduration=3.927050393 podStartE2EDuration="38.472756973s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.899096442 +0000 UTC m=+810.883310297" lastFinishedPulling="2025-12-01 14:12:11.444803022 +0000 UTC m=+845.429016877" observedRunningTime="2025-12-01 14:12:12.461467692 +0000 UTC m=+846.445681547" watchObservedRunningTime="2025-12-01 14:12:12.472756973 +0000 UTC m=+846.456970828" Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.492104 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl" event={"ID":"abe5e9b4-4f45-4fb6-92f7-739d4174996b","Type":"ContainerStarted","Data":"0ec1066cabb63aaa54c047c721c6c0d20f6f78d225265f941d448efa0927f4d1"} Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.493422 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl" Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.506435 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl" Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.513803 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" event={"ID":"baa99a85-be34-458d-bc16-c367d4635b10","Type":"ContainerStarted","Data":"f77d09b0fb8e66d360837f166d9a0a5b67aa354e1e3cca946c75c261a9b8e1ef"} Dec 01 14:12:12 crc kubenswrapper[4585]: E1201 14:12:12.515122 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" podUID="baa99a85-be34-458d-bc16-c367d4635b10" Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.520330 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" event={"ID":"50b75abe-8fa5-4e48-87bb-560b5609feda","Type":"ContainerStarted","Data":"37a1ad05e2b91f625d23b28839b43b0def78d1e75073f5b605d7ded67de929df"} Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.521092 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.532339 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6" event={"ID":"0854e7b6-a6fb-4fd0-9e48-564df5d8fea2","Type":"ContainerStarted","Data":"ef1deb027bdd9225f98f7078e20ba913287a3bb78e2bb63720d6e00828bfdd47"} Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.542466 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-pmpnl" podStartSLOduration=4.029156825 podStartE2EDuration="38.542451051s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.805454846 +0000 UTC m=+810.789668701" lastFinishedPulling="2025-12-01 14:12:11.318749072 +0000 UTC m=+845.302962927" observedRunningTime="2025-12-01 14:12:12.541404533 +0000 UTC m=+846.525618378" watchObservedRunningTime="2025-12-01 14:12:12.542451051 +0000 UTC m=+846.526664906" Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.549179 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" event={"ID":"7b6381d5-3b01-4c14-a553-e4a51274b140","Type":"ContainerStarted","Data":"8a7736684cf28ee24e07c984b00b74b4d02ae7e32d49027e1818d26aae157fa3"} Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.549839 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" Dec 01 14:12:12 crc kubenswrapper[4585]: E1201 14:12:12.586174 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95" podUID="5afd79b9-5528-4ffe-9d3f-ac7b05502348" Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.597939 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" podStartSLOduration=37.597918609 podStartE2EDuration="37.597918609s" podCreationTimestamp="2025-12-01 14:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:12:12.58820021 +0000 UTC m=+846.572414065" watchObservedRunningTime="2025-12-01 14:12:12.597918609 +0000 UTC m=+846.582132464" Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.652302 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" podStartSLOduration=7.9189956630000005 podStartE2EDuration="38.652288979s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.907023673 +0000 UTC m=+810.891237528" lastFinishedPulling="2025-12-01 14:12:07.640316989 +0000 UTC m=+841.624530844" observedRunningTime="2025-12-01 14:12:12.625820123 +0000 UTC m=+846.610033978" watchObservedRunningTime="2025-12-01 14:12:12.652288979 +0000 UTC m=+846.636502834" Dec 01 14:12:12 crc kubenswrapper[4585]: I1201 14:12:12.653324 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" podStartSLOduration=8.472075625 podStartE2EDuration="38.653320636s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.977873872 +0000 UTC m=+810.962087727" lastFinishedPulling="2025-12-01 14:12:07.159118883 +0000 UTC m=+841.143332738" observedRunningTime="2025-12-01 14:12:12.652099664 +0000 UTC m=+846.636313519" watchObservedRunningTime="2025-12-01 14:12:12.653320636 +0000 UTC m=+846.637534491" Dec 01 14:12:12 crc kubenswrapper[4585]: E1201 14:12:12.697337 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-798dw" podUID="c756d201-c2d0-45f1-af3a-acdff1926a1a" Dec 01 14:12:12 crc kubenswrapper[4585]: E1201 14:12:12.728040 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82" podUID="c4697227-2800-4a64-89bf-5bf831077ceb" Dec 01 14:12:12 crc kubenswrapper[4585]: E1201 14:12:12.728890 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr" podUID="60bbdebb-4ac8-4971-82b2-252a989a8c3a" Dec 01 14:12:12 crc kubenswrapper[4585]: E1201 14:12:12.831829 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj" podUID="59994d2c-6485-4beb-bcfc-3fd4a22bd203" Dec 01 14:12:12 crc kubenswrapper[4585]: E1201 14:12:12.878345 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n" podUID="a672a71f-0885-4771-811e-fd658d282a84" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.564810 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj" event={"ID":"59994d2c-6485-4beb-bcfc-3fd4a22bd203","Type":"ContainerStarted","Data":"472dd53901bf7d89179e363a53de60d03fddc05e96670b64a5c0ed67e0da551c"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.578577 4585 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f" event={"ID":"1d516bcd-4ed7-4c83-a07e-3a8f66761090","Type":"ContainerStarted","Data":"16019d698d45a0a005fb1e225a8ee07a969f58d99b63b60c967b6d9d97c34c17"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.578853 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.583204 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.592034 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82" event={"ID":"c4697227-2800-4a64-89bf-5bf831077ceb","Type":"ContainerStarted","Data":"6c6185e9517c6438fe9d33dfc709972df1745e30b02e3ded16d58cb08291f071"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.624605 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-sqs7f" podStartSLOduration=5.174670841 podStartE2EDuration="39.624586647s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.894418417 +0000 UTC m=+810.878632272" lastFinishedPulling="2025-12-01 14:12:11.344334223 +0000 UTC m=+845.328548078" observedRunningTime="2025-12-01 14:12:13.621184046 +0000 UTC m=+847.605397911" watchObservedRunningTime="2025-12-01 14:12:13.624586647 +0000 UTC m=+847.608800502" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.631957 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-798dw" event={"ID":"c756d201-c2d0-45f1-af3a-acdff1926a1a","Type":"ContainerStarted","Data":"481a8fba073f32254efb389d434110fc05bc31b7e73af0d378628529a9c87c01"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.654797 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb" event={"ID":"f62dd90c-aa85-4650-92e0-13e52ec60360","Type":"ContainerStarted","Data":"b0b0926873097992be08b8051bca47c18248a3e5a140c1063e881fe187b2cf84"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.655311 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.661122 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.672071 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" event={"ID":"b847594a-d018-4939-8177-3faf4a42da5a","Type":"ContainerStarted","Data":"faeceb108a3273e2778c0aeda3680ab9ae15c92f62fdad91c4460a5511ba2940"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.672677 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.682890 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q" 
event={"ID":"f4100ac0-da14-4d72-88e8-7f7356dad361","Type":"ContainerStarted","Data":"47a298beda1bf29c2b33c53df82e7a7b1989daa35a20971617b94a5ead3c3111"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.690802 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n" event={"ID":"a672a71f-0885-4771-811e-fd658d282a84","Type":"ContainerStarted","Data":"478bd3f490babd96ebdb1d98dd9e28d2203630fd058e4e1f12a5662d3a47e843"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.714295 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" event={"ID":"59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4","Type":"ContainerStarted","Data":"9e0616af58dce7faed77c5b49b05564174c3b3d5ed06fcdeb0b2fb9ced40e8ca"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.715062 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.728110 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr" event={"ID":"60bbdebb-4ac8-4971-82b2-252a989a8c3a","Type":"ContainerStarted","Data":"a8b5bb1a373d0c63d5bb10043e5262281cdba70704ca55f7faa9f0030734ceb9"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.729381 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-8fdcb" podStartSLOduration=5.030187679 podStartE2EDuration="39.72936464s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.79436815 +0000 UTC m=+810.778582005" lastFinishedPulling="2025-12-01 14:12:11.493545111 +0000 UTC m=+845.477758966" observedRunningTime="2025-12-01 14:12:13.72786481 +0000 UTC m=+847.712078665" watchObservedRunningTime="2025-12-01 14:12:13.72936464 +0000 UTC m=+847.713578495" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.734628 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df" event={"ID":"8da768c2-cb8c-40f9-b8d1-54a66743b340","Type":"ContainerStarted","Data":"1c5a05aafd3df50e87915cbdee52d1f3334991b0ed99cd5eef8f50ee421e7e7d"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.736107 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.749647 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.751823 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95" event={"ID":"5afd79b9-5528-4ffe-9d3f-ac7b05502348","Type":"ContainerStarted","Data":"c6d0685b286f98f88cdd5758ca7be65bfd75c6fba612ccae12e05a1a7a68df6a"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.779379 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq" event={"ID":"7f8c91fb-441e-44f0-bf97-1340df47f4b0","Type":"ContainerStarted","Data":"17b91714b34edb7c5fe3777f39ff3b3a0dd693ccdace571c935299a22dcdff3d"} Dec 01 14:12:13 crc 
kubenswrapper[4585]: I1201 14:12:13.780108 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.780940 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" podStartSLOduration=5.188600781 podStartE2EDuration="39.780930464s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:37.054757021 +0000 UTC m=+811.038970876" lastFinishedPulling="2025-12-01 14:12:11.647086704 +0000 UTC m=+845.631300559" observedRunningTime="2025-12-01 14:12:13.772173931 +0000 UTC m=+847.756387786" watchObservedRunningTime="2025-12-01 14:12:13.780930464 +0000 UTC m=+847.765144319" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.796286 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl" event={"ID":"1c21caba-6277-4106-b637-a4874412f527","Type":"ContainerStarted","Data":"3e1281efe525a2e208508945c23368b6fc101ab665b7a82058295c70e88e61cc"} Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.796323 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.805266 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.814397 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcz6q" podStartSLOduration=3.9354634649999998 podStartE2EDuration="38.814380256s" podCreationTimestamp="2025-12-01 14:11:35 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.999135779 +0000 UTC m=+810.983349634" lastFinishedPulling="2025-12-01 14:12:11.87805257 +0000 UTC m=+845.862266425" observedRunningTime="2025-12-01 14:12:13.805211292 +0000 UTC m=+847.789425137" watchObservedRunningTime="2025-12-01 14:12:13.814380256 +0000 UTC m=+847.798594111" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.897456 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" podStartSLOduration=8.994838621 podStartE2EDuration="39.89743942s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:37.046292356 +0000 UTC m=+811.030506211" lastFinishedPulling="2025-12-01 14:12:07.948893155 +0000 UTC m=+841.933107010" observedRunningTime="2025-12-01 14:12:13.873564674 +0000 UTC m=+847.857778529" watchObservedRunningTime="2025-12-01 14:12:13.89743942 +0000 UTC m=+847.881653275" Dec 01 14:12:13 crc kubenswrapper[4585]: I1201 14:12:13.980736 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq" podStartSLOduration=3.887095308 podStartE2EDuration="39.9807202s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.898850595 +0000 UTC m=+810.883064450" lastFinishedPulling="2025-12-01 14:12:12.992475477 +0000 UTC m=+846.976689342" observedRunningTime="2025-12-01 14:12:13.945363397 +0000 UTC m=+847.929577252" watchObservedRunningTime="2025-12-01 
14:12:13.9807202 +0000 UTC m=+847.964934055" Dec 01 14:12:14 crc kubenswrapper[4585]: I1201 14:12:14.044886 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-ndffl" podStartSLOduration=5.31495744 podStartE2EDuration="40.04486339s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.857608966 +0000 UTC m=+810.841822821" lastFinishedPulling="2025-12-01 14:12:11.587514916 +0000 UTC m=+845.571728771" observedRunningTime="2025-12-01 14:12:14.022466453 +0000 UTC m=+848.006680308" watchObservedRunningTime="2025-12-01 14:12:14.04486339 +0000 UTC m=+848.029077245" Dec 01 14:12:14 crc kubenswrapper[4585]: I1201 14:12:14.046622 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-br8df" podStartSLOduration=3.716275705 podStartE2EDuration="40.046613466s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:35.607898864 +0000 UTC m=+809.592112709" lastFinishedPulling="2025-12-01 14:12:11.938236615 +0000 UTC m=+845.922450470" observedRunningTime="2025-12-01 14:12:14.043353099 +0000 UTC m=+848.027566954" watchObservedRunningTime="2025-12-01 14:12:14.046613466 +0000 UTC m=+848.030827311" Dec 01 14:12:14 crc kubenswrapper[4585]: I1201 14:12:14.801776 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj" event={"ID":"59994d2c-6485-4beb-bcfc-3fd4a22bd203","Type":"ContainerStarted","Data":"159593ddb94a06508277832947b0047bb734eebc240b4a70c1495ed5033387a3"} Dec 01 14:12:14 crc kubenswrapper[4585]: I1201 14:12:14.801894 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj" Dec 01 14:12:14 crc kubenswrapper[4585]: I1201 14:12:14.803178 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82" event={"ID":"c4697227-2800-4a64-89bf-5bf831077ceb","Type":"ContainerStarted","Data":"8b2c42faadac128a0b3650522282b42f30592831db24a23a37522d4b0e66e894"} Dec 01 14:12:14 crc kubenswrapper[4585]: I1201 14:12:14.803368 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82" Dec 01 14:12:14 crc kubenswrapper[4585]: I1201 14:12:14.805398 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6" event={"ID":"0854e7b6-a6fb-4fd0-9e48-564df5d8fea2","Type":"ContainerStarted","Data":"a888c74fd520b1c26ed4d59127c917be6deddcc3424770159aa4ad71ba46efd6"} Dec 01 14:12:14 crc kubenswrapper[4585]: I1201 14:12:14.805425 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6" Dec 01 14:12:14 crc kubenswrapper[4585]: I1201 14:12:14.823274 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj" podStartSLOduration=3.5770071310000002 podStartE2EDuration="40.823253428s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.892470415 +0000 UTC m=+810.876684260" lastFinishedPulling="2025-12-01 14:12:14.138716702 +0000 UTC m=+848.122930557" 
observedRunningTime="2025-12-01 14:12:14.822444286 +0000 UTC m=+848.806658131" watchObservedRunningTime="2025-12-01 14:12:14.823253428 +0000 UTC m=+848.807467283" Dec 01 14:12:14 crc kubenswrapper[4585]: I1201 14:12:14.839175 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82" podStartSLOduration=3.545088861 podStartE2EDuration="40.839162152s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.894710355 +0000 UTC m=+810.878924210" lastFinishedPulling="2025-12-01 14:12:14.188783656 +0000 UTC m=+848.172997501" observedRunningTime="2025-12-01 14:12:14.837901008 +0000 UTC m=+848.822114863" watchObservedRunningTime="2025-12-01 14:12:14.839162152 +0000 UTC m=+848.823376007" Dec 01 14:12:14 crc kubenswrapper[4585]: I1201 14:12:14.855543 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6" podStartSLOduration=4.090406828 podStartE2EDuration="40.855523658s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.591457862 +0000 UTC m=+810.575671717" lastFinishedPulling="2025-12-01 14:12:13.356574692 +0000 UTC m=+847.340788547" observedRunningTime="2025-12-01 14:12:14.849647811 +0000 UTC m=+848.833861666" watchObservedRunningTime="2025-12-01 14:12:14.855523658 +0000 UTC m=+848.839737513" Dec 01 14:12:15 crc kubenswrapper[4585]: I1201 14:12:15.451312 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-m6vwb" Dec 01 14:12:15 crc kubenswrapper[4585]: I1201 14:12:15.697392 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-cqpws" Dec 01 14:12:15 crc kubenswrapper[4585]: I1201 14:12:15.815792 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n" event={"ID":"a672a71f-0885-4771-811e-fd658d282a84","Type":"ContainerStarted","Data":"fc8ba51fb85f62c4b68d5ba92b47591d95b23c942e762449909ab08458614f0d"} Dec 01 14:12:15 crc kubenswrapper[4585]: I1201 14:12:15.816896 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n" Dec 01 14:12:15 crc kubenswrapper[4585]: I1201 14:12:15.819601 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95" event={"ID":"5afd79b9-5528-4ffe-9d3f-ac7b05502348","Type":"ContainerStarted","Data":"05b517278740c983fb01ec392e0b9df01b125d276bdb958304986efea2a71f39"} Dec 01 14:12:15 crc kubenswrapper[4585]: I1201 14:12:15.820098 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95" Dec 01 14:12:15 crc kubenswrapper[4585]: I1201 14:12:15.828502 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-798dw" event={"ID":"c756d201-c2d0-45f1-af3a-acdff1926a1a","Type":"ContainerStarted","Data":"418a41f1f4fa8c36105f3baf2961792fc2670be064db4f52a11bf86e0f62cb7f"} Dec 01 14:12:15 crc kubenswrapper[4585]: I1201 14:12:15.828568 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-798dw" Dec 01 14:12:15 crc kubenswrapper[4585]: I1201 14:12:15.840987 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n" podStartSLOduration=4.319798172 podStartE2EDuration="41.840952026s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.889005053 +0000 UTC m=+810.873218908" lastFinishedPulling="2025-12-01 14:12:14.410158907 +0000 UTC m=+848.394372762" observedRunningTime="2025-12-01 14:12:15.833907628 +0000 UTC m=+849.818121483" watchObservedRunningTime="2025-12-01 14:12:15.840952026 +0000 UTC m=+849.825165881" Dec 01 14:12:15 crc kubenswrapper[4585]: I1201 14:12:15.854872 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95" podStartSLOduration=4.264852397 podStartE2EDuration="41.854855706s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.823488457 +0000 UTC m=+810.807702312" lastFinishedPulling="2025-12-01 14:12:14.413491766 +0000 UTC m=+848.397705621" observedRunningTime="2025-12-01 14:12:15.851172088 +0000 UTC m=+849.835385943" watchObservedRunningTime="2025-12-01 14:12:15.854855706 +0000 UTC m=+849.839069561" Dec 01 14:12:15 crc kubenswrapper[4585]: I1201 14:12:15.873616 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-798dw" podStartSLOduration=4.689875656 podStartE2EDuration="41.873601266s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:37.229952271 +0000 UTC m=+811.214166116" lastFinishedPulling="2025-12-01 14:12:14.413677871 +0000 UTC m=+848.397891726" observedRunningTime="2025-12-01 14:12:15.869834136 +0000 UTC m=+849.854047991" watchObservedRunningTime="2025-12-01 14:12:15.873601266 +0000 UTC m=+849.857815121" Dec 01 14:12:17 crc kubenswrapper[4585]: I1201 14:12:17.537329 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-b9b8558c-w5sxw" Dec 01 14:12:17 crc kubenswrapper[4585]: I1201 14:12:17.847544 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr" event={"ID":"60bbdebb-4ac8-4971-82b2-252a989a8c3a","Type":"ContainerStarted","Data":"61378f6acc9417b9fd35473fad39b2437ac06c2678af9e4f79dc1fdbb5958a1a"} Dec 01 14:12:17 crc kubenswrapper[4585]: I1201 14:12:17.848369 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr" Dec 01 14:12:17 crc kubenswrapper[4585]: I1201 14:12:17.850566 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" event={"ID":"bcc7d39e-d462-4eaa-89fa-625c72c956b6","Type":"ContainerStarted","Data":"8eccefb0cf2141bd44eedceadaed4d1a3aa2745eee7b0f42e86be547d2de7057"} Dec 01 14:12:17 crc kubenswrapper[4585]: I1201 14:12:17.851840 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" event={"ID":"f496d7d1-7362-487d-88d7-33e2c26ce97b","Type":"ContainerStarted","Data":"bc3322828715e2067773378d7670430d0181444105ea38f4d803310b4fd591ee"} Dec 01 14:12:17 crc 
kubenswrapper[4585]: I1201 14:12:17.875935 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr" podStartSLOduration=3.302171007 podStartE2EDuration="43.875919591s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.860460342 +0000 UTC m=+810.844674197" lastFinishedPulling="2025-12-01 14:12:17.434208926 +0000 UTC m=+851.418422781" observedRunningTime="2025-12-01 14:12:17.873618579 +0000 UTC m=+851.857832434" watchObservedRunningTime="2025-12-01 14:12:17.875919591 +0000 UTC m=+851.860133446" Dec 01 14:12:18 crc kubenswrapper[4585]: I1201 14:12:18.859750 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" event={"ID":"bcc7d39e-d462-4eaa-89fa-625c72c956b6","Type":"ContainerStarted","Data":"eceb86191e2c51f56015df179de1c2a3bed659061903d1483fd7240eb172102b"} Dec 01 14:12:18 crc kubenswrapper[4585]: I1201 14:12:18.860388 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:12:18 crc kubenswrapper[4585]: I1201 14:12:18.861420 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" event={"ID":"f496d7d1-7362-487d-88d7-33e2c26ce97b","Type":"ContainerStarted","Data":"f8564fa266a0d54d78f6404d13666cdfb62dd157233cebd9cbd2a30ed05e6dab"} Dec 01 14:12:18 crc kubenswrapper[4585]: I1201 14:12:18.891852 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" podStartSLOduration=35.092435296 podStartE2EDuration="44.89183613s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:12:07.643181485 +0000 UTC m=+841.627395340" lastFinishedPulling="2025-12-01 14:12:17.442582319 +0000 UTC m=+851.426796174" observedRunningTime="2025-12-01 14:12:18.885659276 +0000 UTC m=+852.869873141" watchObservedRunningTime="2025-12-01 14:12:18.89183613 +0000 UTC m=+852.876049995" Dec 01 14:12:18 crc kubenswrapper[4585]: I1201 14:12:18.911784 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" podStartSLOduration=37.087147797 podStartE2EDuration="44.911765861s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:12:09.636660744 +0000 UTC m=+843.620874599" lastFinishedPulling="2025-12-01 14:12:17.461278808 +0000 UTC m=+851.445492663" observedRunningTime="2025-12-01 14:12:18.906043999 +0000 UTC m=+852.890257864" watchObservedRunningTime="2025-12-01 14:12:18.911765861 +0000 UTC m=+852.895979736" Dec 01 14:12:19 crc kubenswrapper[4585]: I1201 14:12:19.869571 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:12:24 crc kubenswrapper[4585]: I1201 14:12:24.552170 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qnbzj" Dec 01 14:12:24 crc kubenswrapper[4585]: I1201 14:12:24.582819 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-4kmsq" Dec 01 14:12:24 crc 
kubenswrapper[4585]: I1201 14:12:24.631132 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-8qd82" Dec 01 14:12:24 crc kubenswrapper[4585]: I1201 14:12:24.645445 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qpqgr" Dec 01 14:12:24 crc kubenswrapper[4585]: I1201 14:12:24.666630 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-vxz95" Dec 01 14:12:24 crc kubenswrapper[4585]: I1201 14:12:24.862475 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-xsbwl" Dec 01 14:12:25 crc kubenswrapper[4585]: I1201 14:12:25.073340 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-6hdc6" Dec 01 14:12:25 crc kubenswrapper[4585]: I1201 14:12:25.339814 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-r4qhj" Dec 01 14:12:25 crc kubenswrapper[4585]: E1201 14:12:25.413795 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" podUID="baa99a85-be34-458d-bc16-c367d4635b10" Dec 01 14:12:25 crc kubenswrapper[4585]: I1201 14:12:25.532513 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5448bbd495-75vsz" Dec 01 14:12:27 crc kubenswrapper[4585]: I1201 14:12:27.143731 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tgt7n" Dec 01 14:12:27 crc kubenswrapper[4585]: I1201 14:12:27.178512 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-798dw" Dec 01 14:12:27 crc kubenswrapper[4585]: I1201 14:12:27.178558 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-77nsb" Dec 01 14:12:30 crc kubenswrapper[4585]: I1201 14:12:30.773140 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh" Dec 01 14:12:38 crc kubenswrapper[4585]: I1201 14:12:38.415014 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 14:12:40 crc kubenswrapper[4585]: I1201 14:12:40.244848 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" event={"ID":"baa99a85-be34-458d-bc16-c367d4635b10","Type":"ContainerStarted","Data":"55e35a49e5936439e7e814fb2a9c7772a62420dee45f88440735c87c39a36f65"} Dec 01 14:12:40 crc kubenswrapper[4585]: I1201 14:12:40.245985 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" Dec 01 
14:12:40 crc kubenswrapper[4585]: I1201 14:12:40.266269 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" podStartSLOduration=3.830830448 podStartE2EDuration="1m6.266242778s" podCreationTimestamp="2025-12-01 14:11:34 +0000 UTC" firstStartedPulling="2025-12-01 14:11:36.899430671 +0000 UTC m=+810.883644526" lastFinishedPulling="2025-12-01 14:12:39.334843001 +0000 UTC m=+873.319056856" observedRunningTime="2025-12-01 14:12:40.258515372 +0000 UTC m=+874.242729257" watchObservedRunningTime="2025-12-01 14:12:40.266242778 +0000 UTC m=+874.250456653" Dec 01 14:12:45 crc kubenswrapper[4585]: I1201 14:12:45.127795 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-w4g57" Dec 01 14:12:59 crc kubenswrapper[4585]: I1201 14:12:59.983544 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ntss9"] Dec 01 14:12:59 crc kubenswrapper[4585]: I1201 14:12:59.985127 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" Dec 01 14:12:59 crc kubenswrapper[4585]: I1201 14:12:59.992133 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 01 14:12:59 crc kubenswrapper[4585]: I1201 14:12:59.992548 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bnnjs" Dec 01 14:12:59 crc kubenswrapper[4585]: I1201 14:12:59.992686 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 01 14:12:59 crc kubenswrapper[4585]: I1201 14:12:59.992697 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.012945 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ntss9"] Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.049164 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sc2xr"] Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.050252 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.055863 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.063520 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3ec753-b8c2-4b1c-b05e-06a62eefe234-config\") pod \"dnsmasq-dns-675f4bcbfc-ntss9\" (UID: \"3e3ec753-b8c2-4b1c-b05e-06a62eefe234\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.063581 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hjt4\" (UniqueName: \"kubernetes.io/projected/3e3ec753-b8c2-4b1c-b05e-06a62eefe234-kube-api-access-6hjt4\") pod \"dnsmasq-dns-675f4bcbfc-ntss9\" (UID: \"3e3ec753-b8c2-4b1c-b05e-06a62eefe234\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.117123 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sc2xr"] Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.164872 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm2k5\" (UniqueName: \"kubernetes.io/projected/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-kube-api-access-rm2k5\") pod \"dnsmasq-dns-78dd6ddcc-sc2xr\" (UID: \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.164935 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3ec753-b8c2-4b1c-b05e-06a62eefe234-config\") pod \"dnsmasq-dns-675f4bcbfc-ntss9\" (UID: \"3e3ec753-b8c2-4b1c-b05e-06a62eefe234\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.164993 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hjt4\" (UniqueName: \"kubernetes.io/projected/3e3ec753-b8c2-4b1c-b05e-06a62eefe234-kube-api-access-6hjt4\") pod \"dnsmasq-dns-675f4bcbfc-ntss9\" (UID: \"3e3ec753-b8c2-4b1c-b05e-06a62eefe234\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.165016 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sc2xr\" (UID: \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.165047 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-config\") pod \"dnsmasq-dns-78dd6ddcc-sc2xr\" (UID: \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.165944 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3ec753-b8c2-4b1c-b05e-06a62eefe234-config\") pod \"dnsmasq-dns-675f4bcbfc-ntss9\" (UID: \"3e3ec753-b8c2-4b1c-b05e-06a62eefe234\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" Dec 01 
14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.186532 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hjt4\" (UniqueName: \"kubernetes.io/projected/3e3ec753-b8c2-4b1c-b05e-06a62eefe234-kube-api-access-6hjt4\") pod \"dnsmasq-dns-675f4bcbfc-ntss9\" (UID: \"3e3ec753-b8c2-4b1c-b05e-06a62eefe234\") " pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.266737 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sc2xr\" (UID: \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.266816 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-config\") pod \"dnsmasq-dns-78dd6ddcc-sc2xr\" (UID: \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.266911 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm2k5\" (UniqueName: \"kubernetes.io/projected/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-kube-api-access-rm2k5\") pod \"dnsmasq-dns-78dd6ddcc-sc2xr\" (UID: \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.268049 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-config\") pod \"dnsmasq-dns-78dd6ddcc-sc2xr\" (UID: \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.268108 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-sc2xr\" (UID: \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.289748 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm2k5\" (UniqueName: \"kubernetes.io/projected/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-kube-api-access-rm2k5\") pod \"dnsmasq-dns-78dd6ddcc-sc2xr\" (UID: \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.304412 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.364409 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.811523 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ntss9"] Dec 01 14:13:00 crc kubenswrapper[4585]: W1201 14:13:00.816161 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e3ec753_b8c2_4b1c_b05e_06a62eefe234.slice/crio-e4d3c904cded6d957f4a99fa0a639d1b631b797adfec75ce10b40b2ae5572bf0 WatchSource:0}: Error finding container e4d3c904cded6d957f4a99fa0a639d1b631b797adfec75ce10b40b2ae5572bf0: Status 404 returned error can't find the container with id e4d3c904cded6d957f4a99fa0a639d1b631b797adfec75ce10b40b2ae5572bf0 Dec 01 14:13:00 crc kubenswrapper[4585]: I1201 14:13:00.888707 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sc2xr"] Dec 01 14:13:00 crc kubenswrapper[4585]: W1201 14:13:00.891561 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6f85986_34f0_4c32_9c59_2a9d9aa8d7f7.slice/crio-1c2f7c6107fac6cd2c16ca92932336250a182fe6ed6fbabdf9314edde9b711cf WatchSource:0}: Error finding container 1c2f7c6107fac6cd2c16ca92932336250a182fe6ed6fbabdf9314edde9b711cf: Status 404 returned error can't find the container with id 1c2f7c6107fac6cd2c16ca92932336250a182fe6ed6fbabdf9314edde9b711cf Dec 01 14:13:01 crc kubenswrapper[4585]: I1201 14:13:01.381455 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" event={"ID":"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7","Type":"ContainerStarted","Data":"1c2f7c6107fac6cd2c16ca92932336250a182fe6ed6fbabdf9314edde9b711cf"} Dec 01 14:13:01 crc kubenswrapper[4585]: I1201 14:13:01.382626 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" event={"ID":"3e3ec753-b8c2-4b1c-b05e-06a62eefe234","Type":"ContainerStarted","Data":"e4d3c904cded6d957f4a99fa0a639d1b631b797adfec75ce10b40b2ae5572bf0"} Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.211193 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ntss9"] Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.228784 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rghfw"] Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.229900 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.271776 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rghfw"] Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.308575 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35d48e93-76cf-4411-9772-d650bb88c378-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-rghfw\" (UID: \"35d48e93-76cf-4411-9772-d650bb88c378\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.308643 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbxw6\" (UniqueName: \"kubernetes.io/projected/35d48e93-76cf-4411-9772-d650bb88c378-kube-api-access-fbxw6\") pod \"dnsmasq-dns-5ccc8479f9-rghfw\" (UID: \"35d48e93-76cf-4411-9772-d650bb88c378\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.309262 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35d48e93-76cf-4411-9772-d650bb88c378-config\") pod \"dnsmasq-dns-5ccc8479f9-rghfw\" (UID: \"35d48e93-76cf-4411-9772-d650bb88c378\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.413137 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35d48e93-76cf-4411-9772-d650bb88c378-config\") pod \"dnsmasq-dns-5ccc8479f9-rghfw\" (UID: \"35d48e93-76cf-4411-9772-d650bb88c378\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.413213 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35d48e93-76cf-4411-9772-d650bb88c378-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-rghfw\" (UID: \"35d48e93-76cf-4411-9772-d650bb88c378\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.413240 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbxw6\" (UniqueName: \"kubernetes.io/projected/35d48e93-76cf-4411-9772-d650bb88c378-kube-api-access-fbxw6\") pod \"dnsmasq-dns-5ccc8479f9-rghfw\" (UID: \"35d48e93-76cf-4411-9772-d650bb88c378\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.414200 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35d48e93-76cf-4411-9772-d650bb88c378-config\") pod \"dnsmasq-dns-5ccc8479f9-rghfw\" (UID: \"35d48e93-76cf-4411-9772-d650bb88c378\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.414749 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35d48e93-76cf-4411-9772-d650bb88c378-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-rghfw\" (UID: \"35d48e93-76cf-4411-9772-d650bb88c378\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.447717 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbxw6\" (UniqueName: 
\"kubernetes.io/projected/35d48e93-76cf-4411-9772-d650bb88c378-kube-api-access-fbxw6\") pod \"dnsmasq-dns-5ccc8479f9-rghfw\" (UID: \"35d48e93-76cf-4411-9772-d650bb88c378\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.576996 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.619576 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sc2xr"] Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.665202 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-828g9"] Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.666415 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.691230 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-828g9"] Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.717805 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvlxp\" (UniqueName: \"kubernetes.io/projected/9fb3832e-e925-4f7d-8409-fc123bc61b44-kube-api-access-nvlxp\") pod \"dnsmasq-dns-57d769cc4f-828g9\" (UID: \"9fb3832e-e925-4f7d-8409-fc123bc61b44\") " pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.717876 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb3832e-e925-4f7d-8409-fc123bc61b44-config\") pod \"dnsmasq-dns-57d769cc4f-828g9\" (UID: \"9fb3832e-e925-4f7d-8409-fc123bc61b44\") " pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.717908 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fb3832e-e925-4f7d-8409-fc123bc61b44-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-828g9\" (UID: \"9fb3832e-e925-4f7d-8409-fc123bc61b44\") " pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.823241 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb3832e-e925-4f7d-8409-fc123bc61b44-config\") pod \"dnsmasq-dns-57d769cc4f-828g9\" (UID: \"9fb3832e-e925-4f7d-8409-fc123bc61b44\") " pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.823306 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fb3832e-e925-4f7d-8409-fc123bc61b44-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-828g9\" (UID: \"9fb3832e-e925-4f7d-8409-fc123bc61b44\") " pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.823355 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvlxp\" (UniqueName: \"kubernetes.io/projected/9fb3832e-e925-4f7d-8409-fc123bc61b44-kube-api-access-nvlxp\") pod \"dnsmasq-dns-57d769cc4f-828g9\" (UID: \"9fb3832e-e925-4f7d-8409-fc123bc61b44\") " pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.824331 4585 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb3832e-e925-4f7d-8409-fc123bc61b44-config\") pod \"dnsmasq-dns-57d769cc4f-828g9\" (UID: \"9fb3832e-e925-4f7d-8409-fc123bc61b44\") " pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.824829 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fb3832e-e925-4f7d-8409-fc123bc61b44-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-828g9\" (UID: \"9fb3832e-e925-4f7d-8409-fc123bc61b44\") " pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.865579 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvlxp\" (UniqueName: \"kubernetes.io/projected/9fb3832e-e925-4f7d-8409-fc123bc61b44-kube-api-access-nvlxp\") pod \"dnsmasq-dns-57d769cc4f-828g9\" (UID: \"9fb3832e-e925-4f7d-8409-fc123bc61b44\") " pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:03 crc kubenswrapper[4585]: I1201 14:13:03.991702 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.064775 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rghfw"] Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.427461 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" event={"ID":"35d48e93-76cf-4411-9772-d650bb88c378","Type":"ContainerStarted","Data":"10fc8348f876bd89ed59316915e8046b9ad2dec12c5f7a2b4e012fc0aaccbdce"} Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.443009 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.459459 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.460329 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.467684 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.468265 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7czjg" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.469012 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.469333 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.469462 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.469566 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.470050 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.525569 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-828g9"] Dec 01 14:13:04 crc kubenswrapper[4585]: W1201 14:13:04.550088 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fb3832e_e925_4f7d_8409_fc123bc61b44.slice/crio-12fe5572a6f285ef8787773d7f33d45caf4571bf216fcb4ade872fe82568c531 WatchSource:0}: Error finding container 12fe5572a6f285ef8787773d7f33d45caf4571bf216fcb4ade872fe82568c531: Status 404 returned error can't find the container with id 12fe5572a6f285ef8787773d7f33d45caf4571bf216fcb4ade872fe82568c531 Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.652827 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.652900 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.652920 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.652954 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.653014 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.653064 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.653085 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.653106 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.653124 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.653144 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.653161 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtdpp\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-kube-api-access-jtdpp\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.753948 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.754018 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.754058 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.754078 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.754098 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.754114 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.754137 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.754155 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtdpp\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-kube-api-access-jtdpp\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.754178 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.754207 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.754221 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.754627 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.755243 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.755263 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.755469 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.756028 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.760686 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.763504 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.766090 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.778502 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.793150 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtdpp\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-kube-api-access-jtdpp\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.795321 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.817600 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.819459 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.820845 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.828108 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.828276 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.828577 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-hjxqc" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.828763 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.829015 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.829129 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.829144 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.840964 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.959346 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.959449 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d41c9a27-f15b-44c5-84b2-0e083f8dc837-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.959488 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.959509 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.959555 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.959575 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.959594 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwnv5\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-kube-api-access-gwnv5\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.959633 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d41c9a27-f15b-44c5-84b2-0e083f8dc837-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.959650 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-config-data\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.959671 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:04 crc kubenswrapper[4585]: I1201 14:13:04.959709 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.061529 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " 
pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.061571 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d41c9a27-f15b-44c5-84b2-0e083f8dc837-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.061605 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.061625 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.061668 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.061683 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.061700 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwnv5\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-kube-api-access-gwnv5\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.061716 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d41c9a27-f15b-44c5-84b2-0e083f8dc837-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.061732 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-config-data\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.061749 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.061770 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.062074 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.062320 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.063017 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.063504 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-config-data\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.064060 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.064138 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.065307 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.067623 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d41c9a27-f15b-44c5-84b2-0e083f8dc837-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.070947 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d41c9a27-f15b-44c5-84b2-0e083f8dc837-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.076862 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.079233 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwnv5\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-kube-api-access-gwnv5\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.097254 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.117784 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.178526 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.441439 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-828g9" event={"ID":"9fb3832e-e925-4f7d-8409-fc123bc61b44","Type":"ContainerStarted","Data":"12fe5572a6f285ef8787773d7f33d45caf4571bf216fcb4ade872fe82568c531"} Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.914361 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 14:13:05 crc kubenswrapper[4585]: I1201 14:13:05.989183 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 14:13:05 crc kubenswrapper[4585]: W1201 14:13:05.993953 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd41c9a27_f15b_44c5_84b2_0e083f8dc837.slice/crio-16f7c8940c8b332113a9028b5b8993a30fe696de6b22be734af4a73e28b2daf2 WatchSource:0}: Error finding container 16f7c8940c8b332113a9028b5b8993a30fe696de6b22be734af4a73e28b2daf2: Status 404 returned error can't find the container with id 16f7c8940c8b332113a9028b5b8993a30fe696de6b22be734af4a73e28b2daf2 Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.250574 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.252496 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.256618 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.257086 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.258089 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.258206 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.258323 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8nrhx" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.262809 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.384576 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a59437-2c03-417a-839f-6b610fa43a83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.384620 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/61a59437-2c03-417a-839f-6b610fa43a83-config-data-default\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.384648 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.384690 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a59437-2c03-417a-839f-6b610fa43a83-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.384793 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/61a59437-2c03-417a-839f-6b610fa43a83-kolla-config\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.384836 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a59437-2c03-417a-839f-6b610fa43a83-operator-scripts\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.384865 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/61a59437-2c03-417a-839f-6b610fa43a83-config-data-generated\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.384891 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc62t\" (UniqueName: \"kubernetes.io/projected/61a59437-2c03-417a-839f-6b610fa43a83-kube-api-access-tc62t\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.455074 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d41c9a27-f15b-44c5-84b2-0e083f8dc837","Type":"ContainerStarted","Data":"16f7c8940c8b332113a9028b5b8993a30fe696de6b22be734af4a73e28b2daf2"} Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.461189 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c266121-e7d2-42aa-b1d9-0d15bdd0f798","Type":"ContainerStarted","Data":"01129af23bcc210a5877ac141d365da74bc31f28218a2968425fc8e15107979c"} Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.489840 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a59437-2c03-417a-839f-6b610fa43a83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.489892 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/61a59437-2c03-417a-839f-6b610fa43a83-config-data-default\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.489914 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.489941 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a59437-2c03-417a-839f-6b610fa43a83-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.489955 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/61a59437-2c03-417a-839f-6b610fa43a83-kolla-config\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.490028 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a59437-2c03-417a-839f-6b610fa43a83-operator-scripts\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.490089 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/61a59437-2c03-417a-839f-6b610fa43a83-config-data-generated\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.490108 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc62t\" (UniqueName: \"kubernetes.io/projected/61a59437-2c03-417a-839f-6b610fa43a83-kube-api-access-tc62t\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.491370 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.491790 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/61a59437-2c03-417a-839f-6b610fa43a83-config-data-generated\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.500198 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.500549 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.500679 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.508609 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/61a59437-2c03-417a-839f-6b610fa43a83-config-data-default\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.509124 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/61a59437-2c03-417a-839f-6b610fa43a83-kolla-config\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.509812 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a59437-2c03-417a-839f-6b610fa43a83-operator-scripts\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.514566 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.525575 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/61a59437-2c03-417a-839f-6b610fa43a83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" 
Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.535180 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a59437-2c03-417a-839f-6b610fa43a83-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.551927 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.617633 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc62t\" (UniqueName: \"kubernetes.io/projected/61a59437-2c03-417a-839f-6b610fa43a83-kube-api-access-tc62t\") pod \"openstack-galera-0\" (UID: \"61a59437-2c03-417a-839f-6b610fa43a83\") " pod="openstack/openstack-galera-0" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.885336 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8nrhx" Dec 01 14:13:06 crc kubenswrapper[4585]: I1201 14:13:06.894037 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.448424 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.452555 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.463078 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-7zkkg" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.463390 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.463509 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.463619 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.478661 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.516400 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/471a678b-d81a-4526-b826-65b359685c99-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.516449 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/471a678b-d81a-4526-b826-65b359685c99-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.516487 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.516504 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471a678b-d81a-4526-b826-65b359685c99-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.516526 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r79s\" (UniqueName: \"kubernetes.io/projected/471a678b-d81a-4526-b826-65b359685c99-kube-api-access-8r79s\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.516548 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/471a678b-d81a-4526-b826-65b359685c99-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.516581 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/471a678b-d81a-4526-b826-65b359685c99-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.516616 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/471a678b-d81a-4526-b826-65b359685c99-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.606029 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.618668 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/471a678b-d81a-4526-b826-65b359685c99-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.619087 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/471a678b-d81a-4526-b826-65b359685c99-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.619143 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.619176 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471a678b-d81a-4526-b826-65b359685c99-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.619211 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r79s\" (UniqueName: \"kubernetes.io/projected/471a678b-d81a-4526-b826-65b359685c99-kube-api-access-8r79s\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.619234 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/471a678b-d81a-4526-b826-65b359685c99-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.619265 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/471a678b-d81a-4526-b826-65b359685c99-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.619299 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/471a678b-d81a-4526-b826-65b359685c99-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.620651 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/471a678b-d81a-4526-b826-65b359685c99-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.620935 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/471a678b-d81a-4526-b826-65b359685c99-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.622286 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/471a678b-d81a-4526-b826-65b359685c99-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.622614 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/471a678b-d81a-4526-b826-65b359685c99-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc 
kubenswrapper[4585]: I1201 14:13:07.625199 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471a678b-d81a-4526-b826-65b359685c99-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.625362 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.625960 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/471a678b-d81a-4526-b826-65b359685c99-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.671117 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r79s\" (UniqueName: \"kubernetes.io/projected/471a678b-d81a-4526-b826-65b359685c99-kube-api-access-8r79s\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.735020 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"471a678b-d81a-4526-b826-65b359685c99\") " pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:07 crc kubenswrapper[4585]: I1201 14:13:07.822256 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.169408 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.170611 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.185244 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nkz5p" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.185600 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.185807 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.222641 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.331634 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.332034 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.332078 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-config-data\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.332140 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjmht\" (UniqueName: \"kubernetes.io/projected/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-kube-api-access-pjmht\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.332156 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-kolla-config\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.436470 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjmht\" (UniqueName: \"kubernetes.io/projected/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-kube-api-access-pjmht\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.436511 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-kolla-config\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.436556 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.436591 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.436624 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-config-data\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.437525 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-config-data\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.438241 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-kolla-config\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.443106 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.455443 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.477786 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjmht\" (UniqueName: \"kubernetes.io/projected/bbcd4b11-d625-4425-82d8-7c32d8c24c5c-kube-api-access-pjmht\") pod \"memcached-0\" (UID: \"bbcd4b11-d625-4425-82d8-7c32d8c24c5c\") " pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.507408 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.538822 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"61a59437-2c03-417a-839f-6b610fa43a83","Type":"ContainerStarted","Data":"2b17caadefd6921bcedcede15e2f870ba4a271e5710ee1427f4ed94c70424550"} Dec 01 14:13:08 crc kubenswrapper[4585]: I1201 14:13:08.884912 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 14:13:09 crc kubenswrapper[4585]: I1201 14:13:09.419492 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 14:13:09 crc kubenswrapper[4585]: I1201 14:13:09.573684 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"471a678b-d81a-4526-b826-65b359685c99","Type":"ContainerStarted","Data":"d15aa62745527df548e7e1899915546119c80552d342952d35e0e2d0971b6ceb"} Dec 01 14:13:10 crc kubenswrapper[4585]: I1201 14:13:10.204865 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 14:13:10 crc kubenswrapper[4585]: I1201 14:13:10.218416 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 14:13:10 crc kubenswrapper[4585]: I1201 14:13:10.221630 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 14:13:10 crc kubenswrapper[4585]: I1201 14:13:10.222682 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-hzsgf" Dec 01 14:13:10 crc kubenswrapper[4585]: I1201 14:13:10.381314 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-585z6\" (UniqueName: \"kubernetes.io/projected/b47683b1-2753-468a-b272-f9f1760a71f3-kube-api-access-585z6\") pod \"kube-state-metrics-0\" (UID: \"b47683b1-2753-468a-b272-f9f1760a71f3\") " pod="openstack/kube-state-metrics-0" Dec 01 14:13:10 crc kubenswrapper[4585]: I1201 14:13:10.482874 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-585z6\" (UniqueName: \"kubernetes.io/projected/b47683b1-2753-468a-b272-f9f1760a71f3-kube-api-access-585z6\") pod \"kube-state-metrics-0\" (UID: \"b47683b1-2753-468a-b272-f9f1760a71f3\") " pod="openstack/kube-state-metrics-0" Dec 01 14:13:10 crc kubenswrapper[4585]: I1201 14:13:10.502144 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-585z6\" (UniqueName: \"kubernetes.io/projected/b47683b1-2753-468a-b272-f9f1760a71f3-kube-api-access-585z6\") pod \"kube-state-metrics-0\" (UID: \"b47683b1-2753-468a-b272-f9f1760a71f3\") " pod="openstack/kube-state-metrics-0" Dec 01 14:13:10 crc kubenswrapper[4585]: I1201 14:13:10.576256 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 14:13:10 crc kubenswrapper[4585]: I1201 14:13:10.612414 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"bbcd4b11-d625-4425-82d8-7c32d8c24c5c","Type":"ContainerStarted","Data":"c80659db4803f4fb94c04275b2ba4295e7dabfd203f3cf51b5d2c6c94dc42f79"} Dec 01 14:13:11 crc kubenswrapper[4585]: I1201 14:13:11.511201 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.590192 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.607487 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.607686 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.613188 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.613212 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pg7n8" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.613370 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.613522 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.623015 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.728204 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.728529 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fa90f38d-8525-4b91-9a7b-717ddc968614-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.728559 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa90f38d-8525-4b91-9a7b-717ddc968614-config\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.728573 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa90f38d-8525-4b91-9a7b-717ddc968614-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.728624 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85sg5\" 
(UniqueName: \"kubernetes.io/projected/fa90f38d-8525-4b91-9a7b-717ddc968614-kube-api-access-85sg5\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.728660 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa90f38d-8525-4b91-9a7b-717ddc968614-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.728680 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa90f38d-8525-4b91-9a7b-717ddc968614-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.728697 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa90f38d-8525-4b91-9a7b-717ddc968614-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.829695 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.829778 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fa90f38d-8525-4b91-9a7b-717ddc968614-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.829805 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa90f38d-8525-4b91-9a7b-717ddc968614-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.829823 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa90f38d-8525-4b91-9a7b-717ddc968614-config\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.829878 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85sg5\" (UniqueName: \"kubernetes.io/projected/fa90f38d-8525-4b91-9a7b-717ddc968614-kube-api-access-85sg5\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.830291 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") device mount path \"/mnt/openstack/pv07\"" 
pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.830313 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa90f38d-8525-4b91-9a7b-717ddc968614-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.830510 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa90f38d-8525-4b91-9a7b-717ddc968614-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.830559 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa90f38d-8525-4b91-9a7b-717ddc968614-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.831730 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fa90f38d-8525-4b91-9a7b-717ddc968614-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.832260 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa90f38d-8525-4b91-9a7b-717ddc968614-config\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.833475 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa90f38d-8525-4b91-9a7b-717ddc968614-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.859308 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa90f38d-8525-4b91-9a7b-717ddc968614-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.859337 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa90f38d-8525-4b91-9a7b-717ddc968614-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.860522 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa90f38d-8525-4b91-9a7b-717ddc968614-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.865980 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85sg5\" (UniqueName: \"kubernetes.io/projected/fa90f38d-8525-4b91-9a7b-717ddc968614-kube-api-access-85sg5\") pod 
\"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.872558 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fa90f38d-8525-4b91-9a7b-717ddc968614\") " pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:12 crc kubenswrapper[4585]: I1201 14:13:12.978051 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.357317 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m7xvm"] Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.358725 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.361305 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.361424 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-mfbb4" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.361524 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.376101 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m7xvm"] Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.404627 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-rt4xq"] Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.406775 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.433400 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rt4xq"] Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465152 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7c01d629-7b26-457f-8ab7-e67464b2e578-var-run\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465417 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f3d9474-e60e-401e-8597-1bd7af4f34c3-ovn-controller-tls-certs\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465482 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7c01d629-7b26-457f-8ab7-e67464b2e578-etc-ovs\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465533 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wqmc\" (UniqueName: \"kubernetes.io/projected/7c01d629-7b26-457f-8ab7-e67464b2e578-kube-api-access-9wqmc\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465584 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f3d9474-e60e-401e-8597-1bd7af4f34c3-scripts\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465637 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxn2n\" (UniqueName: \"kubernetes.io/projected/2f3d9474-e60e-401e-8597-1bd7af4f34c3-kube-api-access-zxn2n\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465700 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7c01d629-7b26-457f-8ab7-e67464b2e578-var-lib\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465733 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c01d629-7b26-457f-8ab7-e67464b2e578-scripts\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465770 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3d9474-e60e-401e-8597-1bd7af4f34c3-combined-ca-bundle\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465795 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f3d9474-e60e-401e-8597-1bd7af4f34c3-var-run\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465861 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f3d9474-e60e-401e-8597-1bd7af4f34c3-var-run-ovn\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465911 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f3d9474-e60e-401e-8597-1bd7af4f34c3-var-log-ovn\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.465931 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7c01d629-7b26-457f-8ab7-e67464b2e578-var-log\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567409 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3d9474-e60e-401e-8597-1bd7af4f34c3-combined-ca-bundle\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567461 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f3d9474-e60e-401e-8597-1bd7af4f34c3-var-run\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567501 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f3d9474-e60e-401e-8597-1bd7af4f34c3-var-run-ovn\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567518 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f3d9474-e60e-401e-8597-1bd7af4f34c3-var-log-ovn\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567539 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7c01d629-7b26-457f-8ab7-e67464b2e578-var-log\") pod \"ovn-controller-ovs-rt4xq\" (UID: 
\"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567561 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7c01d629-7b26-457f-8ab7-e67464b2e578-var-run\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567575 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f3d9474-e60e-401e-8597-1bd7af4f34c3-ovn-controller-tls-certs\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567593 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7c01d629-7b26-457f-8ab7-e67464b2e578-etc-ovs\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567613 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wqmc\" (UniqueName: \"kubernetes.io/projected/7c01d629-7b26-457f-8ab7-e67464b2e578-kube-api-access-9wqmc\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567632 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f3d9474-e60e-401e-8597-1bd7af4f34c3-scripts\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567661 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxn2n\" (UniqueName: \"kubernetes.io/projected/2f3d9474-e60e-401e-8597-1bd7af4f34c3-kube-api-access-zxn2n\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567701 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7c01d629-7b26-457f-8ab7-e67464b2e578-var-lib\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.567717 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c01d629-7b26-457f-8ab7-e67464b2e578-scripts\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.570717 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c01d629-7b26-457f-8ab7-e67464b2e578-scripts\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.571127 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7c01d629-7b26-457f-8ab7-e67464b2e578-var-run\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.573665 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7c01d629-7b26-457f-8ab7-e67464b2e578-etc-ovs\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.574185 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2f3d9474-e60e-401e-8597-1bd7af4f34c3-var-log-ovn\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.574291 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7c01d629-7b26-457f-8ab7-e67464b2e578-var-log\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.574348 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2f3d9474-e60e-401e-8597-1bd7af4f34c3-var-run\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.574620 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7c01d629-7b26-457f-8ab7-e67464b2e578-var-lib\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.574719 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f3d9474-e60e-401e-8597-1bd7af4f34c3-var-run-ovn\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.592808 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wqmc\" (UniqueName: \"kubernetes.io/projected/7c01d629-7b26-457f-8ab7-e67464b2e578-kube-api-access-9wqmc\") pod \"ovn-controller-ovs-rt4xq\" (UID: \"7c01d629-7b26-457f-8ab7-e67464b2e578\") " pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.596696 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxn2n\" (UniqueName: \"kubernetes.io/projected/2f3d9474-e60e-401e-8597-1bd7af4f34c3-kube-api-access-zxn2n\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.601019 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f3d9474-e60e-401e-8597-1bd7af4f34c3-ovn-controller-tls-certs\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " 
pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.601832 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f3d9474-e60e-401e-8597-1bd7af4f34c3-combined-ca-bundle\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.602818 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f3d9474-e60e-401e-8597-1bd7af4f34c3-scripts\") pod \"ovn-controller-m7xvm\" (UID: \"2f3d9474-e60e-401e-8597-1bd7af4f34c3\") " pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.684536 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:14 crc kubenswrapper[4585]: I1201 14:13:14.744175 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.839650 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.846936 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.863553 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.870852 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.871066 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.871134 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.871528 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-xbcsm" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.946649 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3ceca23-b268-4b00-a4c2-026390eae759-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.946746 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a3ceca23-b268-4b00-a4c2-026390eae759-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.946854 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bnq5\" (UniqueName: \"kubernetes.io/projected/a3ceca23-b268-4b00-a4c2-026390eae759-kube-api-access-4bnq5\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.946944 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.947015 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3ceca23-b268-4b00-a4c2-026390eae759-config\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.947043 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3ceca23-b268-4b00-a4c2-026390eae759-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.947534 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ceca23-b268-4b00-a4c2-026390eae759-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:16 crc kubenswrapper[4585]: I1201 14:13:16.947564 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3ceca23-b268-4b00-a4c2-026390eae759-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.049486 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3ceca23-b268-4b00-a4c2-026390eae759-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.049555 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ceca23-b268-4b00-a4c2-026390eae759-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.049581 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3ceca23-b268-4b00-a4c2-026390eae759-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.049633 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3ceca23-b268-4b00-a4c2-026390eae759-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.049706 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/a3ceca23-b268-4b00-a4c2-026390eae759-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.049747 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bnq5\" (UniqueName: \"kubernetes.io/projected/a3ceca23-b268-4b00-a4c2-026390eae759-kube-api-access-4bnq5\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.049838 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.049876 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3ceca23-b268-4b00-a4c2-026390eae759-config\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.051094 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3ceca23-b268-4b00-a4c2-026390eae759-config\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.051334 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3ceca23-b268-4b00-a4c2-026390eae759-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.051471 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a3ceca23-b268-4b00-a4c2-026390eae759-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.051709 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.057394 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3ceca23-b268-4b00-a4c2-026390eae759-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.059901 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3ceca23-b268-4b00-a4c2-026390eae759-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.068785 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3ceca23-b268-4b00-a4c2-026390eae759-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.071741 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bnq5\" (UniqueName: \"kubernetes.io/projected/a3ceca23-b268-4b00-a4c2-026390eae759-kube-api-access-4bnq5\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.114087 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"a3ceca23-b268-4b00-a4c2-026390eae759\") " pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:17 crc kubenswrapper[4585]: I1201 14:13:17.179011 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.443410 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ncdb4"] Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.446089 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.460444 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ncdb4"] Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.491305 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-utilities\") pod \"redhat-operators-ncdb4\" (UID: \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\") " pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.491406 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-catalog-content\") pod \"redhat-operators-ncdb4\" (UID: \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\") " pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.491426 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttnbx\" (UniqueName: \"kubernetes.io/projected/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-kube-api-access-ttnbx\") pod \"redhat-operators-ncdb4\" (UID: \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\") " pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.592712 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-utilities\") pod \"redhat-operators-ncdb4\" (UID: \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\") " pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.592820 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-catalog-content\") pod \"redhat-operators-ncdb4\" (UID: 
\"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\") " pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.592852 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttnbx\" (UniqueName: \"kubernetes.io/projected/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-kube-api-access-ttnbx\") pod \"redhat-operators-ncdb4\" (UID: \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\") " pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.593382 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-catalog-content\") pod \"redhat-operators-ncdb4\" (UID: \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\") " pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.593440 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-utilities\") pod \"redhat-operators-ncdb4\" (UID: \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\") " pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.627491 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttnbx\" (UniqueName: \"kubernetes.io/projected/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-kube-api-access-ttnbx\") pod \"redhat-operators-ncdb4\" (UID: \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\") " pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:19 crc kubenswrapper[4585]: I1201 14:13:19.775423 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:23 crc kubenswrapper[4585]: I1201 14:13:23.737213 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b47683b1-2753-468a-b272-f9f1760a71f3","Type":"ContainerStarted","Data":"a11ccd462fd7d2294421da4cc2582211d5c2da0344a85037b5341f6ea6cccbfa"} Dec 01 14:13:27 crc kubenswrapper[4585]: E1201 14:13:27.048814 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 01 14:13:27 crc kubenswrapper[4585]: E1201 14:13:27.049391 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jtdpp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(6c266121-e7d2-42aa-b1d9-0d15bdd0f798): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:13:27 crc kubenswrapper[4585]: E1201 14:13:27.050595 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="6c266121-e7d2-42aa-b1d9-0d15bdd0f798" Dec 01 14:13:27 crc kubenswrapper[4585]: E1201 14:13:27.118785 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="6c266121-e7d2-42aa-b1d9-0d15bdd0f798" Dec 01 14:13:27 crc kubenswrapper[4585]: E1201 14:13:27.736810 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 01 14:13:27 crc kubenswrapper[4585]: E1201 14:13:27.737321 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n5bdhf8h64fhbch65bh55dh65dhbdhffh67bhdfh696h5d8h5ddh5fbh555hddh5c9h5d5h5b8h695hfbh5b6h644hf4hb5h657hc6h54bh78h649h57dq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pjmht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(bbcd4b11-d625-4425-82d8-7c32d8c24c5c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:13:27 crc kubenswrapper[4585]: E1201 14:13:27.738667 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="bbcd4b11-d625-4425-82d8-7c32d8c24c5c" Dec 01 14:13:27 crc kubenswrapper[4585]: E1201 14:13:27.751024 4585 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 01 14:13:27 crc kubenswrapper[4585]: E1201 14:13:27.751205 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gwnv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(d41c9a27-f15b-44c5-84b2-0e083f8dc837): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:13:27 crc kubenswrapper[4585]: E1201 14:13:27.752403 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="d41c9a27-f15b-44c5-84b2-0e083f8dc837" Dec 01 14:13:28 crc kubenswrapper[4585]: E1201 14:13:28.123461 4585 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="d41c9a27-f15b-44c5-84b2-0e083f8dc837" Dec 01 14:13:28 crc kubenswrapper[4585]: E1201 14:13:28.123802 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="bbcd4b11-d625-4425-82d8-7c32d8c24c5c" Dec 01 14:13:33 crc kubenswrapper[4585]: I1201 14:13:33.776998 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wwrrs"] Dec 01 14:13:33 crc kubenswrapper[4585]: I1201 14:13:33.782336 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:33 crc kubenswrapper[4585]: I1201 14:13:33.788552 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wwrrs"] Dec 01 14:13:33 crc kubenswrapper[4585]: I1201 14:13:33.914186 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c27ce924-56e0-4896-9809-6b44ba1c215b-catalog-content\") pod \"certified-operators-wwrrs\" (UID: \"c27ce924-56e0-4896-9809-6b44ba1c215b\") " pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:33 crc kubenswrapper[4585]: I1201 14:13:33.914261 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvz9t\" (UniqueName: \"kubernetes.io/projected/c27ce924-56e0-4896-9809-6b44ba1c215b-kube-api-access-fvz9t\") pod \"certified-operators-wwrrs\" (UID: \"c27ce924-56e0-4896-9809-6b44ba1c215b\") " pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:33 crc kubenswrapper[4585]: I1201 14:13:33.914420 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c27ce924-56e0-4896-9809-6b44ba1c215b-utilities\") pod \"certified-operators-wwrrs\" (UID: \"c27ce924-56e0-4896-9809-6b44ba1c215b\") " pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:34 crc kubenswrapper[4585]: I1201 14:13:34.015474 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c27ce924-56e0-4896-9809-6b44ba1c215b-catalog-content\") pod \"certified-operators-wwrrs\" (UID: \"c27ce924-56e0-4896-9809-6b44ba1c215b\") " pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:34 crc kubenswrapper[4585]: I1201 14:13:34.015524 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvz9t\" (UniqueName: \"kubernetes.io/projected/c27ce924-56e0-4896-9809-6b44ba1c215b-kube-api-access-fvz9t\") pod \"certified-operators-wwrrs\" (UID: \"c27ce924-56e0-4896-9809-6b44ba1c215b\") " pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:34 crc kubenswrapper[4585]: I1201 14:13:34.015581 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c27ce924-56e0-4896-9809-6b44ba1c215b-utilities\") pod \"certified-operators-wwrrs\" (UID: 
\"c27ce924-56e0-4896-9809-6b44ba1c215b\") " pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:34 crc kubenswrapper[4585]: I1201 14:13:34.016140 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c27ce924-56e0-4896-9809-6b44ba1c215b-utilities\") pod \"certified-operators-wwrrs\" (UID: \"c27ce924-56e0-4896-9809-6b44ba1c215b\") " pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:34 crc kubenswrapper[4585]: I1201 14:13:34.016050 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c27ce924-56e0-4896-9809-6b44ba1c215b-catalog-content\") pod \"certified-operators-wwrrs\" (UID: \"c27ce924-56e0-4896-9809-6b44ba1c215b\") " pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:34 crc kubenswrapper[4585]: I1201 14:13:34.041419 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvz9t\" (UniqueName: \"kubernetes.io/projected/c27ce924-56e0-4896-9809-6b44ba1c215b-kube-api-access-fvz9t\") pod \"certified-operators-wwrrs\" (UID: \"c27ce924-56e0-4896-9809-6b44ba1c215b\") " pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:34 crc kubenswrapper[4585]: I1201 14:13:34.120063 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:35 crc kubenswrapper[4585]: I1201 14:13:35.981897 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ncdb4"] Dec 01 14:13:36 crc kubenswrapper[4585]: E1201 14:13:36.465960 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 14:13:36 crc kubenswrapper[4585]: E1201 14:13:36.466304 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fbxw6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-rghfw_openstack(35d48e93-76cf-4411-9772-d650bb88c378): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:13:36 crc kubenswrapper[4585]: E1201 14:13:36.467774 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" podUID="35d48e93-76cf-4411-9772-d650bb88c378" Dec 01 14:13:36 crc kubenswrapper[4585]: E1201 14:13:36.499899 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 14:13:36 crc kubenswrapper[4585]: E1201 14:13:36.500401 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nvlxp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-828g9_openstack(9fb3832e-e925-4f7d-8409-fc123bc61b44): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:13:36 crc kubenswrapper[4585]: E1201 14:13:36.505610 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-828g9" podUID="9fb3832e-e925-4f7d-8409-fc123bc61b44" Dec 01 14:13:36 crc kubenswrapper[4585]: E1201 14:13:36.515734 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 14:13:36 crc kubenswrapper[4585]: E1201 14:13:36.515947 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rm2k5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-sc2xr_openstack(d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:13:36 crc kubenswrapper[4585]: E1201 14:13:36.517379 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" podUID="d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7" Dec 01 14:13:36 crc kubenswrapper[4585]: E1201 14:13:36.639460 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 14:13:36 crc kubenswrapper[4585]: E1201 14:13:36.639794 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6hjt4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-ntss9_openstack(3e3ec753-b8c2-4b1c-b05e-06a62eefe234): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 01 14:13:36 crc kubenswrapper[4585]: E1201 14:13:36.641279 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" podUID="3e3ec753-b8c2-4b1c-b05e-06a62eefe234"
Dec 01 14:13:36 crc kubenswrapper[4585]: I1201 14:13:36.714514 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m7xvm"]
Dec 01 14:13:36 crc kubenswrapper[4585]: W1201 14:13:36.778072 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f3d9474_e60e_401e_8597_1bd7af4f34c3.slice/crio-6ea0b0dc987fcef1ab5e3afeb81688c3e54567fc2d8f10e76a7e04bbc5341ff3 WatchSource:0}: Error finding container 6ea0b0dc987fcef1ab5e3afeb81688c3e54567fc2d8f10e76a7e04bbc5341ff3: Status 404 returned error can't find the container with id 6ea0b0dc987fcef1ab5e3afeb81688c3e54567fc2d8f10e76a7e04bbc5341ff3
Dec 01 14:13:36 crc kubenswrapper[4585]: I1201 14:13:36.834804 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wwrrs"]
Dec 01 14:13:36 crc kubenswrapper[4585]: W1201 14:13:36.943237 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc27ce924_56e0_4896_9809_6b44ba1c215b.slice/crio-251bb33fb2202960997322c33ff2b83b7a245b778256961376467542e98738f1 WatchSource:0}: Error finding container 251bb33fb2202960997322c33ff2b83b7a245b778256961376467542e98738f1: Status 404 returned error can't find the container with id 251bb33fb2202960997322c33ff2b83b7a245b778256961376467542e98738f1
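The serialized &Container{...} dumps above are the Go core/v1 Container structs that the kubelet logs verbatim when a container fails to start; all four dnsmasq-dns pods fail on the same "init" container. The fields visible in those entries decode roughly as the sketch below. This is reconstructed only from the logged fields, not the operator-rendered manifest: the CONFIG_HASH value is elided, the automatically injected kube-api-access-* projected token mount is omitted, the dnsmasq-dns-675f4bcbfc-ntss9 variant has no dns-svc mount, and variable names are illustrative.

// Sketch: the dnsmasq "init" container as a corev1.Container, rebuilt from the
// fields visible in the log entries above. Values not shown in the log are
// omitted or replaced by placeholders.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func boolPtr(b bool) *bool    { return &b }
func int64Ptr(i int64) *int64 { return &i }

var dnsmasqInit = corev1.Container{
	Name:    "init",
	Image:   "quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified",
	Command: []string{"/bin/bash"},
	Args: []string{"-c", "dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d " +
		"--hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug " +
		"--bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- " +
		"--no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test"},
	Env: []corev1.EnvVar{
		{Name: "CONFIG_HASH", Value: "<hash of the rendered config, elided>"},
		{Name: "POD_IP", ValueFrom: &corev1.EnvVarSource{
			FieldRef: &corev1.ObjectFieldSelector{APIVersion: "v1", FieldPath: "status.podIP"},
		}},
	},
	VolumeMounts: []corev1.VolumeMount{
		{Name: "config", ReadOnly: true, MountPath: "/etc/dnsmasq.d/config.cfg", SubPath: "dns"},
		{Name: "dns-svc", ReadOnly: true, MountPath: "/etc/dnsmasq.d/hosts/dns-svc", SubPath: "dns-svc"},
	},
	ImagePullPolicy: corev1.PullIfNotPresent,
	SecurityContext: &corev1.SecurityContext{
		Capabilities:             &corev1.Capabilities{Drop: []corev1.Capability{"ALL"}},
		RunAsUser:                int64Ptr(1000650000),
		RunAsNonRoot:             boolPtr(true),
		AllowPrivilegeEscalation: boolPtr(false),
		SeccompProfile:           &corev1.SeccompProfile{Type: corev1.SeccompProfileTypeRuntimeDefault},
	},
}

func main() {
	// Print the image reference the kubelet was unable to pull.
	fmt.Println(dnsmasqInit.Name, dnsmasqInit.Image)
}

Since the failure itself is the pull being cancelled mid-copy ("copying config: context canceled") rather than a bad spec, a usual next step would be to retry the pull directly on the node, for example with crictl pull quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified, to separate a transient cancellation from a registry or network problem.

Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 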
14:13:37.194078 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m7xvm" event={"ID":"2f3d9474-e60e-401e-8597-1bd7af4f34c3","Type":"ContainerStarted","Data":"6ea0b0dc987fcef1ab5e3afeb81688c3e54567fc2d8f10e76a7e04bbc5341ff3"} Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.195492 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwrrs" event={"ID":"c27ce924-56e0-4896-9809-6b44ba1c215b","Type":"ContainerStarted","Data":"251bb33fb2202960997322c33ff2b83b7a245b778256961376467542e98738f1"} Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.197692 4585 generic.go:334] "Generic (PLEG): container finished" podID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerID="f4af9d3d5df9bb817d43a4fcf227b346ab6f58b2e386bff42298a911d6612d3a" exitCode=0 Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.197794 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncdb4" event={"ID":"cca7a4db-9ec1-4a3c-8458-ab724cdd6861","Type":"ContainerDied","Data":"f4af9d3d5df9bb817d43a4fcf227b346ab6f58b2e386bff42298a911d6612d3a"} Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.197839 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncdb4" event={"ID":"cca7a4db-9ec1-4a3c-8458-ab724cdd6861","Type":"ContainerStarted","Data":"cbd1c68a0082301f4ac70db9ba3bdc8168461564e0e5b637f025c4bc634414d3"} Dec 01 14:13:37 crc kubenswrapper[4585]: E1201 14:13:37.200661 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" podUID="35d48e93-76cf-4411-9772-d650bb88c378" Dec 01 14:13:37 crc kubenswrapper[4585]: E1201 14:13:37.200674 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-828g9" podUID="9fb3832e-e925-4f7d-8409-fc123bc61b44" Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.413427 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 14:13:37 crc kubenswrapper[4585]: W1201 14:13:37.678688 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa90f38d_8525_4b91_9a7b_717ddc968614.slice/crio-fd41ebf35127476a508339094d7b01ca1879fa9d8448fe4b1da100614c3d4645 WatchSource:0}: Error finding container fd41ebf35127476a508339094d7b01ca1879fa9d8448fe4b1da100614c3d4645: Status 404 returned error can't find the container with id fd41ebf35127476a508339094d7b01ca1879fa9d8448fe4b1da100614c3d4645 Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.694063 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.779763 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hjt4\" (UniqueName: \"kubernetes.io/projected/3e3ec753-b8c2-4b1c-b05e-06a62eefe234-kube-api-access-6hjt4\") pod \"3e3ec753-b8c2-4b1c-b05e-06a62eefe234\" (UID: \"3e3ec753-b8c2-4b1c-b05e-06a62eefe234\") " Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.780211 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3ec753-b8c2-4b1c-b05e-06a62eefe234-config\") pod \"3e3ec753-b8c2-4b1c-b05e-06a62eefe234\" (UID: \"3e3ec753-b8c2-4b1c-b05e-06a62eefe234\") " Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.781069 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3ec753-b8c2-4b1c-b05e-06a62eefe234-config" (OuterVolumeSpecName: "config") pod "3e3ec753-b8c2-4b1c-b05e-06a62eefe234" (UID: "3e3ec753-b8c2-4b1c-b05e-06a62eefe234"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.786702 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3ec753-b8c2-4b1c-b05e-06a62eefe234-kube-api-access-6hjt4" (OuterVolumeSpecName: "kube-api-access-6hjt4") pod "3e3ec753-b8c2-4b1c-b05e-06a62eefe234" (UID: "3e3ec753-b8c2-4b1c-b05e-06a62eefe234"). InnerVolumeSpecName "kube-api-access-6hjt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.876186 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.882212 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3ec753-b8c2-4b1c-b05e-06a62eefe234-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.882255 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hjt4\" (UniqueName: \"kubernetes.io/projected/3e3ec753-b8c2-4b1c-b05e-06a62eefe234-kube-api-access-6hjt4\") on node \"crc\" DevicePath \"\"" Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.982773 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-config\") pod \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\" (UID: \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\") " Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.982862 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-dns-svc\") pod \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\" (UID: \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\") " Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.982926 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm2k5\" (UniqueName: \"kubernetes.io/projected/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-kube-api-access-rm2k5\") pod \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\" (UID: \"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7\") " Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.984260 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-config" (OuterVolumeSpecName: "config") pod "d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7" (UID: "d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.985234 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7" (UID: "d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:13:37 crc kubenswrapper[4585]: I1201 14:13:37.990320 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-kube-api-access-rm2k5" (OuterVolumeSpecName: "kube-api-access-rm2k5") pod "d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7" (UID: "d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7"). InnerVolumeSpecName "kube-api-access-rm2k5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.058134 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.085511 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.085545 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.085870 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm2k5\" (UniqueName: \"kubernetes.io/projected/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7-kube-api-access-rm2k5\") on node \"crc\" DevicePath \"\"" Dec 01 14:13:38 crc kubenswrapper[4585]: W1201 14:13:38.095147 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3ceca23_b268_4b00_a4c2_026390eae759.slice/crio-26079ceaeb4635f35214e53c7d2e5d905aad1fb1bcc8ff54b221372cc6b88f2d WatchSource:0}: Error finding container 26079ceaeb4635f35214e53c7d2e5d905aad1fb1bcc8ff54b221372cc6b88f2d: Status 404 returned error can't find the container with id 26079ceaeb4635f35214e53c7d2e5d905aad1fb1bcc8ff54b221372cc6b88f2d Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.209919 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" event={"ID":"d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7","Type":"ContainerDied","Data":"1c2f7c6107fac6cd2c16ca92932336250a182fe6ed6fbabdf9314edde9b711cf"} Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.210033 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-sc2xr" Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.224247 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"61a59437-2c03-417a-839f-6b610fa43a83","Type":"ContainerStarted","Data":"2a2f75ff79700d434c7c94d4fed6574a68aff306bff0d77148ab9daff92359bc"} Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.252193 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"471a678b-d81a-4526-b826-65b359685c99","Type":"ContainerStarted","Data":"fc96bbf32e64b1e892481f39e49d4afc45b0eb4943c591e1aa1135da3ee465d8"} Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.255309 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fa90f38d-8525-4b91-9a7b-717ddc968614","Type":"ContainerStarted","Data":"fd41ebf35127476a508339094d7b01ca1879fa9d8448fe4b1da100614c3d4645"} Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.257505 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" event={"ID":"3e3ec753-b8c2-4b1c-b05e-06a62eefe234","Type":"ContainerDied","Data":"e4d3c904cded6d957f4a99fa0a639d1b631b797adfec75ce10b40b2ae5572bf0"} Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.257690 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-ntss9" Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.259007 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a3ceca23-b268-4b00-a4c2-026390eae759","Type":"ContainerStarted","Data":"26079ceaeb4635f35214e53c7d2e5d905aad1fb1bcc8ff54b221372cc6b88f2d"} Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.339990 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sc2xr"] Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.347434 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-sc2xr"] Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.391283 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ntss9"] Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.400650 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-ntss9"] Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.443210 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3ec753-b8c2-4b1c-b05e-06a62eefe234" path="/var/lib/kubelet/pods/3e3ec753-b8c2-4b1c-b05e-06a62eefe234/volumes" Dec 01 14:13:38 crc kubenswrapper[4585]: I1201 14:13:38.443628 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7" path="/var/lib/kubelet/pods/d6f85986-34f0-4c32-9c59-2a9d9aa8d7f7/volumes" Dec 01 14:13:39 crc kubenswrapper[4585]: I1201 14:13:39.138190 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rt4xq"] Dec 01 14:13:39 crc kubenswrapper[4585]: I1201 14:13:39.268551 4585 generic.go:334] "Generic (PLEG): container finished" podID="c27ce924-56e0-4896-9809-6b44ba1c215b" containerID="5fb137ae4e61f67062fe6cf034ea1b9fd0c0b6986ead83b4d1c29d954332db67" exitCode=0 Dec 01 14:13:39 crc kubenswrapper[4585]: I1201 14:13:39.269395 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwrrs" 
event={"ID":"c27ce924-56e0-4896-9809-6b44ba1c215b","Type":"ContainerDied","Data":"5fb137ae4e61f67062fe6cf034ea1b9fd0c0b6986ead83b4d1c29d954332db67"} Dec 01 14:13:40 crc kubenswrapper[4585]: E1201 14:13:40.094237 4585 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcca7a4db_9ec1_4a3c_8458_ab724cdd6861.slice/crio-8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53.scope\": RecentStats: unable to find data in memory cache]" Dec 01 14:13:40 crc kubenswrapper[4585]: I1201 14:13:40.276500 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncdb4" event={"ID":"cca7a4db-9ec1-4a3c-8458-ab724cdd6861","Type":"ContainerStarted","Data":"8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53"} Dec 01 14:13:40 crc kubenswrapper[4585]: I1201 14:13:40.279467 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b47683b1-2753-468a-b272-f9f1760a71f3","Type":"ContainerStarted","Data":"8147e2dbb97af8be7979d82e5fa350b2f97bc7cf7ff3d0b0cf18ef412e320908"} Dec 01 14:13:40 crc kubenswrapper[4585]: I1201 14:13:40.279876 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 14:13:40 crc kubenswrapper[4585]: I1201 14:13:40.318219 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.645050239 podStartE2EDuration="30.318202894s" podCreationTimestamp="2025-12-01 14:13:10 +0000 UTC" firstStartedPulling="2025-12-01 14:13:23.352728561 +0000 UTC m=+917.336942416" lastFinishedPulling="2025-12-01 14:13:39.025881216 +0000 UTC m=+933.010095071" observedRunningTime="2025-12-01 14:13:40.317965048 +0000 UTC m=+934.302178903" watchObservedRunningTime="2025-12-01 14:13:40.318202894 +0000 UTC m=+934.302416749" Dec 01 14:13:41 crc kubenswrapper[4585]: W1201 14:13:41.936252 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c01d629_7b26_457f_8ab7_e67464b2e578.slice/crio-b292ffd5807d568be403255b548b0c2f1784cdf2a584c91a15900f5b8fe353a4 WatchSource:0}: Error finding container b292ffd5807d568be403255b548b0c2f1784cdf2a584c91a15900f5b8fe353a4: Status 404 returned error can't find the container with id b292ffd5807d568be403255b548b0c2f1784cdf2a584c91a15900f5b8fe353a4 Dec 01 14:13:42 crc kubenswrapper[4585]: I1201 14:13:42.292700 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rt4xq" event={"ID":"7c01d629-7b26-457f-8ab7-e67464b2e578","Type":"ContainerStarted","Data":"b292ffd5807d568be403255b548b0c2f1784cdf2a584c91a15900f5b8fe353a4"} Dec 01 14:13:42 crc kubenswrapper[4585]: I1201 14:13:42.296004 4585 generic.go:334] "Generic (PLEG): container finished" podID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerID="8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53" exitCode=0 Dec 01 14:13:42 crc kubenswrapper[4585]: I1201 14:13:42.296046 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncdb4" event={"ID":"cca7a4db-9ec1-4a3c-8458-ab724cdd6861","Type":"ContainerDied","Data":"8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53"} Dec 01 14:13:43 crc kubenswrapper[4585]: I1201 14:13:43.310410 4585 generic.go:334] "Generic (PLEG): container finished" 
podID="61a59437-2c03-417a-839f-6b610fa43a83" containerID="2a2f75ff79700d434c7c94d4fed6574a68aff306bff0d77148ab9daff92359bc" exitCode=0 Dec 01 14:13:43 crc kubenswrapper[4585]: I1201 14:13:43.310967 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"61a59437-2c03-417a-839f-6b610fa43a83","Type":"ContainerDied","Data":"2a2f75ff79700d434c7c94d4fed6574a68aff306bff0d77148ab9daff92359bc"} Dec 01 14:13:43 crc kubenswrapper[4585]: I1201 14:13:43.315725 4585 generic.go:334] "Generic (PLEG): container finished" podID="471a678b-d81a-4526-b826-65b359685c99" containerID="fc96bbf32e64b1e892481f39e49d4afc45b0eb4943c591e1aa1135da3ee465d8" exitCode=0 Dec 01 14:13:43 crc kubenswrapper[4585]: I1201 14:13:43.315818 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"471a678b-d81a-4526-b826-65b359685c99","Type":"ContainerDied","Data":"fc96bbf32e64b1e892481f39e49d4afc45b0eb4943c591e1aa1135da3ee465d8"} Dec 01 14:13:43 crc kubenswrapper[4585]: I1201 14:13:43.322316 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fa90f38d-8525-4b91-9a7b-717ddc968614","Type":"ContainerStarted","Data":"77fc7dd63cac5d8ee1e9b590b3691b2c251521c14fc0dd67ab55fa4036a6f759"} Dec 01 14:13:43 crc kubenswrapper[4585]: I1201 14:13:43.344885 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m7xvm" event={"ID":"2f3d9474-e60e-401e-8597-1bd7af4f34c3","Type":"ContainerStarted","Data":"813f3fa758b5f2193069208d9b97aeaf92b8fa087588177965acaa49bb3feb50"} Dec 01 14:13:43 crc kubenswrapper[4585]: I1201 14:13:43.345689 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-m7xvm" Dec 01 14:13:43 crc kubenswrapper[4585]: I1201 14:13:43.367003 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a3ceca23-b268-4b00-a4c2-026390eae759","Type":"ContainerStarted","Data":"6ad0fe5bffba1884911d60b8f8245f5043c176902e48fe33621fc7a262ce2c95"} Dec 01 14:13:43 crc kubenswrapper[4585]: I1201 14:13:43.377813 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c266121-e7d2-42aa-b1d9-0d15bdd0f798","Type":"ContainerStarted","Data":"01471bb78ac8279a59d5f59a8cd08029ea754a6ffbcf48823c084903b339191c"} Dec 01 14:13:43 crc kubenswrapper[4585]: I1201 14:13:43.414889 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-m7xvm" podStartSLOduration=23.521856055 podStartE2EDuration="29.414869398s" podCreationTimestamp="2025-12-01 14:13:14 +0000 UTC" firstStartedPulling="2025-12-01 14:13:36.805492379 +0000 UTC m=+930.789706224" lastFinishedPulling="2025-12-01 14:13:42.698505702 +0000 UTC m=+936.682719567" observedRunningTime="2025-12-01 14:13:43.411525879 +0000 UTC m=+937.395739734" watchObservedRunningTime="2025-12-01 14:13:43.414869398 +0000 UTC m=+937.399083253" Dec 01 14:13:43 crc kubenswrapper[4585]: I1201 14:13:43.420357 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwrrs" event={"ID":"c27ce924-56e0-4896-9809-6b44ba1c215b","Type":"ContainerStarted","Data":"51dce2d73e3eb49ba393ede7f058a91cd475563b03913982b5e7df3148f643c5"} Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.435603 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"bbcd4b11-d625-4425-82d8-7c32d8c24c5c","Type":"ContainerStarted","Data":"4cc4e3bd64a7e972bed8c6c48c963007c0854f0b0ed635674b5824481b009f18"} Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.437314 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.442163 4585 generic.go:334] "Generic (PLEG): container finished" podID="c27ce924-56e0-4896-9809-6b44ba1c215b" containerID="51dce2d73e3eb49ba393ede7f058a91cd475563b03913982b5e7df3148f643c5" exitCode=0 Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.442283 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwrrs" event={"ID":"c27ce924-56e0-4896-9809-6b44ba1c215b","Type":"ContainerDied","Data":"51dce2d73e3eb49ba393ede7f058a91cd475563b03913982b5e7df3148f643c5"} Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.446558 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncdb4" event={"ID":"cca7a4db-9ec1-4a3c-8458-ab724cdd6861","Type":"ContainerStarted","Data":"ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252"} Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.470209 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d41c9a27-f15b-44c5-84b2-0e083f8dc837","Type":"ContainerStarted","Data":"136bde2b6e8e606ac594a703cd22914247375724ce60c2580695ad1f22d011e9"} Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.478805 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"61a59437-2c03-417a-839f-6b610fa43a83","Type":"ContainerStarted","Data":"5ceee744be9020760158850a642844752c6821eb578847749d0bc907fb206796"} Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.481856 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"471a678b-d81a-4526-b826-65b359685c99","Type":"ContainerStarted","Data":"97d02ef604a589d24146bd4b3d73bdb23f059059d2ad34ecad616ab80dc0f8fd"} Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.488155 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.414531146 podStartE2EDuration="36.488133146s" podCreationTimestamp="2025-12-01 14:13:08 +0000 UTC" firstStartedPulling="2025-12-01 14:13:09.626349311 +0000 UTC m=+903.610563166" lastFinishedPulling="2025-12-01 14:13:42.699951291 +0000 UTC m=+936.684165166" observedRunningTime="2025-12-01 14:13:44.465447901 +0000 UTC m=+938.449661756" watchObservedRunningTime="2025-12-01 14:13:44.488133146 +0000 UTC m=+938.472347001" Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.501346 4585 generic.go:334] "Generic (PLEG): container finished" podID="7c01d629-7b26-457f-8ab7-e67464b2e578" containerID="2adb5a10ca3c2215153a293552b3b277a14a6ca6d6bf857f6ad4119884ff79d0" exitCode=0 Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.501571 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rt4xq" event={"ID":"7c01d629-7b26-457f-8ab7-e67464b2e578","Type":"ContainerDied","Data":"2adb5a10ca3c2215153a293552b3b277a14a6ca6d6bf857f6ad4119884ff79d0"} Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.511070 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ncdb4" podStartSLOduration=19.85087846 podStartE2EDuration="25.511050957s" 
podCreationTimestamp="2025-12-01 14:13:19 +0000 UTC" firstStartedPulling="2025-12-01 14:13:37.413716002 +0000 UTC m=+931.397929857" lastFinishedPulling="2025-12-01 14:13:43.073888499 +0000 UTC m=+937.058102354" observedRunningTime="2025-12-01 14:13:44.508876489 +0000 UTC m=+938.493090344" watchObservedRunningTime="2025-12-01 14:13:44.511050957 +0000 UTC m=+938.495264812"
Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.536852 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=10.745094016 podStartE2EDuration="38.536835844s" podCreationTimestamp="2025-12-01 14:13:06 +0000 UTC" firstStartedPulling="2025-12-01 14:13:08.986498275 +0000 UTC m=+902.970712130" lastFinishedPulling="2025-12-01 14:13:36.778240103 +0000 UTC m=+930.762453958" observedRunningTime="2025-12-01 14:13:44.529846338 +0000 UTC m=+938.514060193" watchObservedRunningTime="2025-12-01 14:13:44.536835844 +0000 UTC m=+938.521049699"
Dec 01 14:13:44 crc kubenswrapper[4585]: I1201 14:13:44.620064 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=10.819658998 podStartE2EDuration="39.620044482s" podCreationTimestamp="2025-12-01 14:13:05 +0000 UTC" firstStartedPulling="2025-12-01 14:13:07.628021954 +0000 UTC m=+901.612235809" lastFinishedPulling="2025-12-01 14:13:36.428407438 +0000 UTC m=+930.412621293" observedRunningTime="2025-12-01 14:13:44.618784299 +0000 UTC m=+938.602998154" watchObservedRunningTime="2025-12-01 14:13:44.620044482 +0000 UTC m=+938.604258337"
Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.146121 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p2d6v"]
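The pod_startup_latency_tracker entries above record the kubelet's startup SLI bookkeeping: podStartSLOduration is the end-to-end startup time with the image-pull window (lastFinishedPulling minus firstStartedPulling) taken out. The openstack-galera-0 numbers bear this out exactly; a minimal check in Go, with the literal values copied from the entry above:

// Verify podStartSLOduration == podStartE2EDuration minus the image-pull
// window, using the openstack-galera-0 values logged above.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	// Errors ignored: the inputs are the literal strings from the log entry.
	firstStartedPulling, _ := time.Parse(layout, "2025-12-01 14:13:07.628021954 +0000 UTC")
	lastFinishedPulling, _ := time.Parse(layout, "2025-12-01 14:13:36.428407438 +0000 UTC")
	e2e, _ := time.ParseDuration("39.620044482s") // podStartE2EDuration

	pullWindow := lastFinishedPulling.Sub(firstStartedPulling)
	slo := e2e - pullWindow

	fmt.Println(pullWindow) // 28.800385484s
	fmt.Println(slo)        // 10.819658998s, matching podStartSLOduration in the log
}

(The kubelet does this arithmetic on its monotonic clock, the m=+... offsets in the timestamps, so recomputing from the wall-clock strings can drift by a few tens of nanoseconds for some entries, e.g. memcached-0.)

Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.148430 4585 util.go:30] "No sandbox for pod can be found. 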
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.154887 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2d6v"] Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.336965 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pprvq\" (UniqueName: \"kubernetes.io/projected/29ef2e82-35e4-4b31-9324-4fd4274a82b1-kube-api-access-pprvq\") pod \"redhat-marketplace-p2d6v\" (UID: \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\") " pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.337090 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ef2e82-35e4-4b31-9324-4fd4274a82b1-catalog-content\") pod \"redhat-marketplace-p2d6v\" (UID: \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\") " pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.337176 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ef2e82-35e4-4b31-9324-4fd4274a82b1-utilities\") pod \"redhat-marketplace-p2d6v\" (UID: \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\") " pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.438952 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ef2e82-35e4-4b31-9324-4fd4274a82b1-catalog-content\") pod \"redhat-marketplace-p2d6v\" (UID: \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\") " pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.439085 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ef2e82-35e4-4b31-9324-4fd4274a82b1-utilities\") pod \"redhat-marketplace-p2d6v\" (UID: \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\") " pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.439158 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pprvq\" (UniqueName: \"kubernetes.io/projected/29ef2e82-35e4-4b31-9324-4fd4274a82b1-kube-api-access-pprvq\") pod \"redhat-marketplace-p2d6v\" (UID: \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\") " pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.439580 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ef2e82-35e4-4b31-9324-4fd4274a82b1-utilities\") pod \"redhat-marketplace-p2d6v\" (UID: \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\") " pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.439611 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ef2e82-35e4-4b31-9324-4fd4274a82b1-catalog-content\") pod \"redhat-marketplace-p2d6v\" (UID: \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\") " pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.479764 4585 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pprvq\" (UniqueName: \"kubernetes.io/projected/29ef2e82-35e4-4b31-9324-4fd4274a82b1-kube-api-access-pprvq\") pod \"redhat-marketplace-p2d6v\" (UID: \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\") " pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.770648 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.903123 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 01 14:13:46 crc kubenswrapper[4585]: I1201 14:13:46.903507 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 01 14:13:47 crc kubenswrapper[4585]: I1201 14:13:47.219273 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2d6v"] Dec 01 14:13:47 crc kubenswrapper[4585]: I1201 14:13:47.524558 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2d6v" event={"ID":"29ef2e82-35e4-4b31-9324-4fd4274a82b1","Type":"ContainerStarted","Data":"ad7739df0666fc00220eddb370a2b21b8e84e63ce396c44abb118e0136b88730"} Dec 01 14:13:47 crc kubenswrapper[4585]: I1201 14:13:47.823500 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:47 crc kubenswrapper[4585]: I1201 14:13:47.823746 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:48 crc kubenswrapper[4585]: I1201 14:13:48.513794 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 01 14:13:48 crc kubenswrapper[4585]: I1201 14:13:48.560236 4585 generic.go:334] "Generic (PLEG): container finished" podID="29ef2e82-35e4-4b31-9324-4fd4274a82b1" containerID="58a24cb030fbaa6dee22fe3aa48c5d1e5d0db0d3418d954c72346d90bff87464" exitCode=0 Dec 01 14:13:48 crc kubenswrapper[4585]: I1201 14:13:48.561952 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2d6v" event={"ID":"29ef2e82-35e4-4b31-9324-4fd4274a82b1","Type":"ContainerDied","Data":"58a24cb030fbaa6dee22fe3aa48c5d1e5d0db0d3418d954c72346d90bff87464"} Dec 01 14:13:49 crc kubenswrapper[4585]: I1201 14:13:49.574857 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rt4xq" event={"ID":"7c01d629-7b26-457f-8ab7-e67464b2e578","Type":"ContainerStarted","Data":"1d81ca6d82febe2f1850bef708c0e8543a35c343bbd392f85d3240464129c0da"} Dec 01 14:13:49 crc kubenswrapper[4585]: I1201 14:13:49.775870 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:49 crc kubenswrapper[4585]: I1201 14:13:49.775921 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.551545 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-828g9"] Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.615787 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.625760 4585 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-7cb5889db5-7s67d"] Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.627063 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.641293 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-7s67d"] Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.725808 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522a4623-50b5-4e3c-97bf-d67856196c1f-config\") pod \"dnsmasq-dns-7cb5889db5-7s67d\" (UID: \"522a4623-50b5-4e3c-97bf-d67856196c1f\") " pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.725854 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jthv\" (UniqueName: \"kubernetes.io/projected/522a4623-50b5-4e3c-97bf-d67856196c1f-kube-api-access-2jthv\") pod \"dnsmasq-dns-7cb5889db5-7s67d\" (UID: \"522a4623-50b5-4e3c-97bf-d67856196c1f\") " pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.725899 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522a4623-50b5-4e3c-97bf-d67856196c1f-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-7s67d\" (UID: \"522a4623-50b5-4e3c-97bf-d67856196c1f\") " pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.825665 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ncdb4" podUID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerName="registry-server" probeResult="failure" output=< Dec 01 14:13:50 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:13:50 crc kubenswrapper[4585]: > Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.829904 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522a4623-50b5-4e3c-97bf-d67856196c1f-config\") pod \"dnsmasq-dns-7cb5889db5-7s67d\" (UID: \"522a4623-50b5-4e3c-97bf-d67856196c1f\") " pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.829966 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jthv\" (UniqueName: \"kubernetes.io/projected/522a4623-50b5-4e3c-97bf-d67856196c1f-kube-api-access-2jthv\") pod \"dnsmasq-dns-7cb5889db5-7s67d\" (UID: \"522a4623-50b5-4e3c-97bf-d67856196c1f\") " pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.830032 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522a4623-50b5-4e3c-97bf-d67856196c1f-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-7s67d\" (UID: \"522a4623-50b5-4e3c-97bf-d67856196c1f\") " pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.830887 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522a4623-50b5-4e3c-97bf-d67856196c1f-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-7s67d\" (UID: \"522a4623-50b5-4e3c-97bf-d67856196c1f\") " pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:13:50 
crc kubenswrapper[4585]: I1201 14:13:50.830930 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522a4623-50b5-4e3c-97bf-d67856196c1f-config\") pod \"dnsmasq-dns-7cb5889db5-7s67d\" (UID: \"522a4623-50b5-4e3c-97bf-d67856196c1f\") " pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.870910 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jthv\" (UniqueName: \"kubernetes.io/projected/522a4623-50b5-4e3c-97bf-d67856196c1f-kube-api-access-2jthv\") pod \"dnsmasq-dns-7cb5889db5-7s67d\" (UID: \"522a4623-50b5-4e3c-97bf-d67856196c1f\") " pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:13:50 crc kubenswrapper[4585]: I1201 14:13:50.972988 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.665630 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.677343 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.699348 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.699626 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.699745 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-s94v4" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.700092 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.701366 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.746215 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-225fn\" (UniqueName: \"kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-kube-api-access-225fn\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.746271 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3bc5c97f-1882-47da-843c-f8dba234f1f3-lock\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.746292 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.746338 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3bc5c97f-1882-47da-843c-f8dba234f1f3-cache\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 
14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.746356 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.847425 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3bc5c97f-1882-47da-843c-f8dba234f1f3-lock\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.847474 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.847542 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3bc5c97f-1882-47da-843c-f8dba234f1f3-cache\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.847572 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.847669 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-225fn\" (UniqueName: \"kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-kube-api-access-225fn\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.847882 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3bc5c97f-1882-47da-843c-f8dba234f1f3-lock\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.848236 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3bc5c97f-1882-47da-843c-f8dba234f1f3-cache\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: E1201 14:13:51.848241 4585 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 14:13:51 crc kubenswrapper[4585]: E1201 14:13:51.848274 4585 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 14:13:51 crc kubenswrapper[4585]: E1201 14:13:51.848313 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift podName:3bc5c97f-1882-47da-843c-f8dba234f1f3 nodeName:}" failed. 
No retries permitted until 2025-12-01 14:13:52.348298329 +0000 UTC m=+946.332512184 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift") pod "swift-storage-0" (UID: "3bc5c97f-1882-47da-843c-f8dba234f1f3") : configmap "swift-ring-files" not found Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.848473 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.868636 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-225fn\" (UniqueName: \"kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-kube-api-access-225fn\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:51 crc kubenswrapper[4585]: I1201 14:13:51.872617 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.115227 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-7zn7j"] Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.116840 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.121398 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.121860 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.122478 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.122833 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7zn7j"] Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.254730 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-ring-data-devices\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.255117 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-combined-ca-bundle\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.255145 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-dispersionconf\") pod 
\"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.255216 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-swiftconf\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.255242 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-scripts\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.255265 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-etc-swift\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.255334 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m9qn\" (UniqueName: \"kubernetes.io/projected/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-kube-api-access-9m9qn\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.378247 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-swiftconf\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.378448 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-scripts\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.378571 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-etc-swift\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.378825 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m9qn\" (UniqueName: \"kubernetes.io/projected/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-kube-api-access-9m9qn\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.379534 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-ring-data-devices\") pod \"swift-ring-rebalance-7zn7j\" (UID: 
\"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.379684 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.379790 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-combined-ca-bundle\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.379933 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-dispersionconf\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.380618 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-ring-data-devices\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.379187 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-etc-swift\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: E1201 14:13:52.381653 4585 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 14:13:52 crc kubenswrapper[4585]: E1201 14:13:52.381750 4585 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.382146 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-scripts\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: E1201 14:13:52.382561 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift podName:3bc5c97f-1882-47da-843c-f8dba234f1f3 nodeName:}" failed. No retries permitted until 2025-12-01 14:13:53.38253865 +0000 UTC m=+947.366752555 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift") pod "swift-storage-0" (UID: "3bc5c97f-1882-47da-843c-f8dba234f1f3") : configmap "swift-ring-files" not found Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.385065 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-combined-ca-bundle\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.394146 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-swiftconf\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.394425 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-dispersionconf\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.398888 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m9qn\" (UniqueName: \"kubernetes.io/projected/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-kube-api-access-9m9qn\") pod \"swift-ring-rebalance-7zn7j\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.481757 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.632304 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwrrs" event={"ID":"c27ce924-56e0-4896-9809-6b44ba1c215b","Type":"ContainerStarted","Data":"3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381"} Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.664075 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wwrrs" podStartSLOduration=9.404909277 podStartE2EDuration="19.664057694s" podCreationTimestamp="2025-12-01 14:13:33 +0000 UTC" firstStartedPulling="2025-12-01 14:13:41.931626651 +0000 UTC m=+935.915840506" lastFinishedPulling="2025-12-01 14:13:52.190775078 +0000 UTC m=+946.174988923" observedRunningTime="2025-12-01 14:13:52.661126956 +0000 UTC m=+946.645340811" watchObservedRunningTime="2025-12-01 14:13:52.664057694 +0000 UTC m=+946.648271549" Dec 01 14:13:52 crc kubenswrapper[4585]: I1201 14:13:52.680501 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-7s67d"] Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.009406 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7zn7j"] Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.242887 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.378436 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.406125 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:53 crc kubenswrapper[4585]: E1201 14:13:53.406320 4585 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 14:13:53 crc kubenswrapper[4585]: E1201 14:13:53.406346 4585 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 14:13:53 crc kubenswrapper[4585]: E1201 14:13:53.406406 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift podName:3bc5c97f-1882-47da-843c-f8dba234f1f3 nodeName:}" failed. No retries permitted until 2025-12-01 14:13:55.406382772 +0000 UTC m=+949.390596677 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift") pod "swift-storage-0" (UID: "3bc5c97f-1882-47da-843c-f8dba234f1f3") : configmap "swift-ring-files" not found Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.639161 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fa90f38d-8525-4b91-9a7b-717ddc968614","Type":"ContainerStarted","Data":"aa0638bae941b7d11fb14db8106e4611a02fbaed3cff380e618fca19707781aa"} Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.642263 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rt4xq" event={"ID":"7c01d629-7b26-457f-8ab7-e67464b2e578","Type":"ContainerStarted","Data":"1b81dc325c423b8170f45b3afd5a3a5013a78c73bd19c6b03f75fc319f980e6a"} Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.642473 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.642666 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.644116 4585 generic.go:334] "Generic (PLEG): container finished" podID="522a4623-50b5-4e3c-97bf-d67856196c1f" containerID="53d5b094c54f144e2e8cccbd318414bf2352303f76dc4ed9c126e64f0dc13550" exitCode=0 Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.644278 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" event={"ID":"522a4623-50b5-4e3c-97bf-d67856196c1f","Type":"ContainerDied","Data":"53d5b094c54f144e2e8cccbd318414bf2352303f76dc4ed9c126e64f0dc13550"} Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.644399 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" event={"ID":"522a4623-50b5-4e3c-97bf-d67856196c1f","Type":"ContainerStarted","Data":"fd31558c02c6399721060ec979262fe5bffce8f002ddc06fb001beb99b7dbce3"} Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.646010 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a3ceca23-b268-4b00-a4c2-026390eae759","Type":"ContainerStarted","Data":"60c467c1e4a02d4e752811f2f6eea40c6a524bd403b78462f6041ae423ecaca4"} Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.647789 4585 generic.go:334] "Generic (PLEG): container finished" podID="35d48e93-76cf-4411-9772-d650bb88c378" containerID="50ffa3ba2d85cfd74212a0890a8a8bcbe4d3caac88328492211f31bebf254999" exitCode=0 Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.659885 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" event={"ID":"35d48e93-76cf-4411-9772-d650bb88c378","Type":"ContainerDied","Data":"50ffa3ba2d85cfd74212a0890a8a8bcbe4d3caac88328492211f31bebf254999"} Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.664696 4585 generic.go:334] "Generic (PLEG): container finished" podID="9fb3832e-e925-4f7d-8409-fc123bc61b44" containerID="5a7c95196e727c38a5226b71fcb927d06c845f3e7b9743ddb377463a204bac04" exitCode=0 Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.665828 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-828g9" event={"ID":"9fb3832e-e925-4f7d-8409-fc123bc61b44","Type":"ContainerDied","Data":"5a7c95196e727c38a5226b71fcb927d06c845f3e7b9743ddb377463a204bac04"} Dec 01 14:13:53 crc 
kubenswrapper[4585]: I1201 14:13:53.668014 4585 generic.go:334] "Generic (PLEG): container finished" podID="29ef2e82-35e4-4b31-9324-4fd4274a82b1" containerID="430b270ffeb7e50a2b070aad3755126a55b242deaa33c100d79a334dd36c77f6" exitCode=0 Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.668142 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2d6v" event={"ID":"29ef2e82-35e4-4b31-9324-4fd4274a82b1","Type":"ContainerDied","Data":"430b270ffeb7e50a2b070aad3755126a55b242deaa33c100d79a334dd36c77f6"} Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.676150 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7zn7j" event={"ID":"26bcdef2-b1e8-4848-abc4-b1f6a45c9916","Type":"ContainerStarted","Data":"4e55ba08df100205d4936300028b810792534d1c4f92c5fe78d834f9eb0c4adb"} Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.694742 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=28.08419912 podStartE2EDuration="42.694720408s" podCreationTimestamp="2025-12-01 14:13:11 +0000 UTC" firstStartedPulling="2025-12-01 14:13:37.694770953 +0000 UTC m=+931.678984798" lastFinishedPulling="2025-12-01 14:13:52.305292231 +0000 UTC m=+946.289506086" observedRunningTime="2025-12-01 14:13:53.683214021 +0000 UTC m=+947.667427886" watchObservedRunningTime="2025-12-01 14:13:53.694720408 +0000 UTC m=+947.678934263" Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.806563 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=24.61874831 podStartE2EDuration="38.806549279s" podCreationTimestamp="2025-12-01 14:13:15 +0000 UTC" firstStartedPulling="2025-12-01 14:13:38.118294783 +0000 UTC m=+932.102508638" lastFinishedPulling="2025-12-01 14:13:52.306095752 +0000 UTC m=+946.290309607" observedRunningTime="2025-12-01 14:13:53.744521385 +0000 UTC m=+947.728735240" watchObservedRunningTime="2025-12-01 14:13:53.806549279 +0000 UTC m=+947.790763134" Dec 01 14:13:53 crc kubenswrapper[4585]: I1201 14:13:53.863400 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-rt4xq" podStartSLOduration=39.101920466 podStartE2EDuration="39.863377143s" podCreationTimestamp="2025-12-01 14:13:14 +0000 UTC" firstStartedPulling="2025-12-01 14:13:41.937665702 +0000 UTC m=+935.921879557" lastFinishedPulling="2025-12-01 14:13:42.699122379 +0000 UTC m=+936.683336234" observedRunningTime="2025-12-01 14:13:53.839946119 +0000 UTC m=+947.824159974" watchObservedRunningTime="2025-12-01 14:13:53.863377143 +0000 UTC m=+947.847590998" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.120622 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.120779 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.121106 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:13:54 crc kubenswrapper[4585]: E1201 14:13:54.258967 4585 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 01 14:13:54 crc kubenswrapper[4585]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/35d48e93-76cf-4411-9772-d650bb88c378/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 01 14:13:54 crc kubenswrapper[4585]: > podSandboxID="10fc8348f876bd89ed59316915e8046b9ad2dec12c5f7a2b4e012fc0aaccbdce" Dec 01 14:13:54 crc kubenswrapper[4585]: E1201 14:13:54.259314 4585 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 01 14:13:54 crc kubenswrapper[4585]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fbxw6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-5ccc8479f9-rghfw_openstack(35d48e93-76cf-4411-9772-d650bb88c378): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/35d48e93-76cf-4411-9772-d650bb88c378/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 01 14:13:54 crc kubenswrapper[4585]: > logger="UnhandledError" Dec 01 14:13:54 crc kubenswrapper[4585]: E1201 14:13:54.260561 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/35d48e93-76cf-4411-9772-d650bb88c378/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" podUID="35d48e93-76cf-4411-9772-d650bb88c378" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.323724 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb3832e-e925-4f7d-8409-fc123bc61b44-config\") pod \"9fb3832e-e925-4f7d-8409-fc123bc61b44\" (UID: \"9fb3832e-e925-4f7d-8409-fc123bc61b44\") " Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.324045 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fb3832e-e925-4f7d-8409-fc123bc61b44-dns-svc\") pod \"9fb3832e-e925-4f7d-8409-fc123bc61b44\" (UID: \"9fb3832e-e925-4f7d-8409-fc123bc61b44\") " Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.324094 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvlxp\" (UniqueName: \"kubernetes.io/projected/9fb3832e-e925-4f7d-8409-fc123bc61b44-kube-api-access-nvlxp\") pod \"9fb3832e-e925-4f7d-8409-fc123bc61b44\" (UID: \"9fb3832e-e925-4f7d-8409-fc123bc61b44\") " Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.337617 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb3832e-e925-4f7d-8409-fc123bc61b44-kube-api-access-nvlxp" (OuterVolumeSpecName: "kube-api-access-nvlxp") pod "9fb3832e-e925-4f7d-8409-fc123bc61b44" (UID: "9fb3832e-e925-4f7d-8409-fc123bc61b44"). InnerVolumeSpecName "kube-api-access-nvlxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.370996 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fb3832e-e925-4f7d-8409-fc123bc61b44-config" (OuterVolumeSpecName: "config") pod "9fb3832e-e925-4f7d-8409-fc123bc61b44" (UID: "9fb3832e-e925-4f7d-8409-fc123bc61b44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.418305 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fb3832e-e925-4f7d-8409-fc123bc61b44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9fb3832e-e925-4f7d-8409-fc123bc61b44" (UID: "9fb3832e-e925-4f7d-8409-fc123bc61b44"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.432050 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fb3832e-e925-4f7d-8409-fc123bc61b44-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.432082 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvlxp\" (UniqueName: \"kubernetes.io/projected/9fb3832e-e925-4f7d-8409-fc123bc61b44-kube-api-access-nvlxp\") on node \"crc\" DevicePath \"\"" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.432092 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fb3832e-e925-4f7d-8409-fc123bc61b44-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.690931 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-828g9" event={"ID":"9fb3832e-e925-4f7d-8409-fc123bc61b44","Type":"ContainerDied","Data":"12fe5572a6f285ef8787773d7f33d45caf4571bf216fcb4ade872fe82568c531"} Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.691310 4585 scope.go:117] "RemoveContainer" containerID="5a7c95196e727c38a5226b71fcb927d06c845f3e7b9743ddb377463a204bac04" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.690986 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-828g9" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.694725 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2d6v" event={"ID":"29ef2e82-35e4-4b31-9324-4fd4274a82b1","Type":"ContainerStarted","Data":"2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f"} Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.707415 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" event={"ID":"522a4623-50b5-4e3c-97bf-d67856196c1f","Type":"ContainerStarted","Data":"2a1ecf6518aae1143de2b3c2518e58fc7979b6d1954c5b5deb5688bca4cee177"} Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.759061 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-828g9"] Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.770894 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-828g9"] Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.791204 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p2d6v" podStartSLOduration=5.867142331 podStartE2EDuration="8.791184335s" podCreationTimestamp="2025-12-01 14:13:46 +0000 UTC" firstStartedPulling="2025-12-01 14:13:51.42666611 +0000 UTC m=+945.410879965" lastFinishedPulling="2025-12-01 14:13:54.350708114 +0000 UTC m=+948.334921969" observedRunningTime="2025-12-01 14:13:54.779822602 +0000 UTC m=+948.764036477" watchObservedRunningTime="2025-12-01 14:13:54.791184335 +0000 UTC m=+948.775398190" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.809879 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" podStartSLOduration=4.809863243 podStartE2EDuration="4.809863243s" podCreationTimestamp="2025-12-01 14:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-01 14:13:54.806259607 +0000 UTC m=+948.790473452" watchObservedRunningTime="2025-12-01 14:13:54.809863243 +0000 UTC m=+948.794077098" Dec 01 14:13:54 crc kubenswrapper[4585]: I1201 14:13:54.979053 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:55 crc kubenswrapper[4585]: I1201 14:13:55.022280 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:55 crc kubenswrapper[4585]: I1201 14:13:55.173199 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-wwrrs" podUID="c27ce924-56e0-4896-9809-6b44ba1c215b" containerName="registry-server" probeResult="failure" output=< Dec 01 14:13:55 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:13:55 crc kubenswrapper[4585]: > Dec 01 14:13:55 crc kubenswrapper[4585]: I1201 14:13:55.448562 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:55 crc kubenswrapper[4585]: E1201 14:13:55.448731 4585 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 14:13:55 crc kubenswrapper[4585]: E1201 14:13:55.448749 4585 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 14:13:55 crc kubenswrapper[4585]: E1201 14:13:55.448803 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift podName:3bc5c97f-1882-47da-843c-f8dba234f1f3 nodeName:}" failed. No retries permitted until 2025-12-01 14:13:59.448786403 +0000 UTC m=+953.433000248 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift") pod "swift-storage-0" (UID: "3bc5c97f-1882-47da-843c-f8dba234f1f3") : configmap "swift-ring-files" not found Dec 01 14:13:55 crc kubenswrapper[4585]: I1201 14:13:55.715825 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:13:55 crc kubenswrapper[4585]: I1201 14:13:55.715861 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:55 crc kubenswrapper[4585]: I1201 14:13:55.753399 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.042776 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rghfw"] Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.073322 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xddp8"] Dec 01 14:13:56 crc kubenswrapper[4585]: E1201 14:13:56.073620 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb3832e-e925-4f7d-8409-fc123bc61b44" containerName="init" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.073637 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb3832e-e925-4f7d-8409-fc123bc61b44" containerName="init" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.073818 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb3832e-e925-4f7d-8409-fc123bc61b44" containerName="init" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.074560 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.076674 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.137676 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-xqldv"] Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.145916 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.159309 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.163877 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5299e047-b328-440f-a888-8001cad4933b-config\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.163945 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5299e047-b328-440f-a888-8001cad4933b-ovn-rundir\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.164008 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5299e047-b328-440f-a888-8001cad4933b-combined-ca-bundle\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.164061 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5299e047-b328-440f-a888-8001cad4933b-ovs-rundir\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.164093 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcd4g\" (UniqueName: \"kubernetes.io/projected/5299e047-b328-440f-a888-8001cad4933b-kube-api-access-lcd4g\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.164149 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5299e047-b328-440f-a888-8001cad4933b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.183170 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.244317 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xddp8"] Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.261076 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-xqldv"] Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.265713 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5299e047-b328-440f-a888-8001cad4933b-ovn-rundir\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " 
pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.265769 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5299e047-b328-440f-a888-8001cad4933b-combined-ca-bundle\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.265833 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5299e047-b328-440f-a888-8001cad4933b-ovs-rundir\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.265858 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcd4g\" (UniqueName: \"kubernetes.io/projected/5299e047-b328-440f-a888-8001cad4933b-kube-api-access-lcd4g\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.265894 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-xqldv\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.265913 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5299e047-b328-440f-a888-8001cad4933b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.265989 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-config\") pod \"dnsmasq-dns-74f6f696b9-xqldv\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.266010 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvfgb\" (UniqueName: \"kubernetes.io/projected/4e11bdcb-4b1d-466a-8c52-8dc536396e07-kube-api-access-fvfgb\") pod \"dnsmasq-dns-74f6f696b9-xqldv\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.266071 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-xqldv\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.266091 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5299e047-b328-440f-a888-8001cad4933b-config\") pod \"ovn-controller-metrics-xddp8\" (UID: 
\"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.268794 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5299e047-b328-440f-a888-8001cad4933b-ovn-rundir\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.270389 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5299e047-b328-440f-a888-8001cad4933b-config\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.271424 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5299e047-b328-440f-a888-8001cad4933b-ovs-rundir\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.279413 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5299e047-b328-440f-a888-8001cad4933b-combined-ca-bundle\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.291951 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcd4g\" (UniqueName: \"kubernetes.io/projected/5299e047-b328-440f-a888-8001cad4933b-kube-api-access-lcd4g\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.293726 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.296624 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5299e047-b328-440f-a888-8001cad4933b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xddp8\" (UID: \"5299e047-b328-440f-a888-8001cad4933b\") " pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.369066 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvfgb\" (UniqueName: \"kubernetes.io/projected/4e11bdcb-4b1d-466a-8c52-8dc536396e07-kube-api-access-fvfgb\") pod \"dnsmasq-dns-74f6f696b9-xqldv\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.369171 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-xqldv\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.369298 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-xqldv\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.369336 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-config\") pod \"dnsmasq-dns-74f6f696b9-xqldv\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.371781 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-config\") pod \"dnsmasq-dns-74f6f696b9-xqldv\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.372827 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-xqldv\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.373627 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-xqldv\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.393030 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xddp8" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.393195 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvfgb\" (UniqueName: \"kubernetes.io/projected/4e11bdcb-4b1d-466a-8c52-8dc536396e07-kube-api-access-fvfgb\") pod \"dnsmasq-dns-74f6f696b9-xqldv\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.440532 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fb3832e-e925-4f7d-8409-fc123bc61b44" path="/var/lib/kubelet/pods/9fb3832e-e925-4f7d-8409-fc123bc61b44/volumes" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.478561 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.508139 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.568882 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-7s67d"] Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.606826 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-xb2ht"] Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.608144 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.613826 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.643087 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xb2ht"] Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.679071 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-config\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.679164 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.679311 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-dns-svc\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.679383 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.679538 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf9qj\" (UniqueName: \"kubernetes.io/projected/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-kube-api-access-rf9qj\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.721767 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.741952 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.761941 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.772124 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.772480 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.781168 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf9qj\" (UniqueName: 
\"kubernetes.io/projected/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-kube-api-access-rf9qj\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.781220 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-config\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.781309 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.781364 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-dns-svc\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.781402 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.782192 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.783003 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.784055 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-dns-svc\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.784795 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-config\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.818307 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf9qj\" (UniqueName: \"kubernetes.io/projected/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-kube-api-access-rf9qj\") pod \"dnsmasq-dns-698758b865-xb2ht\" (UID: 
\"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.884834 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:13:56 crc kubenswrapper[4585]: I1201 14:13:56.959997 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.108522 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.109871 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.112187 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.112796 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-m74rh" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.113052 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.117365 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.127304 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.192577 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28dn9\" (UniqueName: \"kubernetes.io/projected/e15da9d0-0ba7-4885-8da4-89631b7886f6-kube-api-access-28dn9\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.192634 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e15da9d0-0ba7-4885-8da4-89631b7886f6-scripts\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.192652 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15da9d0-0ba7-4885-8da4-89631b7886f6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.192680 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15da9d0-0ba7-4885-8da4-89631b7886f6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.192712 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15da9d0-0ba7-4885-8da4-89631b7886f6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 
14:13:57.192756 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e15da9d0-0ba7-4885-8da4-89631b7886f6-config\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.192850 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e15da9d0-0ba7-4885-8da4-89631b7886f6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.314215 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e15da9d0-0ba7-4885-8da4-89631b7886f6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.315468 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e15da9d0-0ba7-4885-8da4-89631b7886f6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.315933 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28dn9\" (UniqueName: \"kubernetes.io/projected/e15da9d0-0ba7-4885-8da4-89631b7886f6-kube-api-access-28dn9\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.316028 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e15da9d0-0ba7-4885-8da4-89631b7886f6-scripts\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.316460 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15da9d0-0ba7-4885-8da4-89631b7886f6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.316706 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15da9d0-0ba7-4885-8da4-89631b7886f6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.316889 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15da9d0-0ba7-4885-8da4-89631b7886f6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.317315 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e15da9d0-0ba7-4885-8da4-89631b7886f6-config\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc 
kubenswrapper[4585]: I1201 14:13:57.319149 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e15da9d0-0ba7-4885-8da4-89631b7886f6-config\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.319280 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e15da9d0-0ba7-4885-8da4-89631b7886f6-scripts\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.323728 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15da9d0-0ba7-4885-8da4-89631b7886f6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.332107 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e15da9d0-0ba7-4885-8da4-89631b7886f6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.341998 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15da9d0-0ba7-4885-8da4-89631b7886f6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.351154 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28dn9\" (UniqueName: \"kubernetes.io/projected/e15da9d0-0ba7-4885-8da4-89631b7886f6-kube-api-access-28dn9\") pod \"ovn-northd-0\" (UID: \"e15da9d0-0ba7-4885-8da4-89631b7886f6\") " pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.427795 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.729615 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" podUID="522a4623-50b5-4e3c-97bf-d67856196c1f" containerName="dnsmasq-dns" containerID="cri-o://2a1ecf6518aae1143de2b3c2518e58fc7979b6d1954c5b5deb5688bca4cee177" gracePeriod=10 Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.975302 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jfsfm"] Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.976307 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jfsfm" Dec 01 14:13:57 crc kubenswrapper[4585]: I1201 14:13:57.988049 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jfsfm"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.104701 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3772-account-create-update-x874m"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.105680 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3772-account-create-update-x874m" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.110897 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.130605 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3772-account-create-update-x874m"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.138737 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4dnw\" (UniqueName: \"kubernetes.io/projected/14c28b2f-8076-4123-8a7b-e907e8d88a30-kube-api-access-p4dnw\") pod \"keystone-db-create-jfsfm\" (UID: \"14c28b2f-8076-4123-8a7b-e907e8d88a30\") " pod="openstack/keystone-db-create-jfsfm" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.138813 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14c28b2f-8076-4123-8a7b-e907e8d88a30-operator-scripts\") pod \"keystone-db-create-jfsfm\" (UID: \"14c28b2f-8076-4123-8a7b-e907e8d88a30\") " pod="openstack/keystone-db-create-jfsfm" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.240253 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4dnw\" (UniqueName: \"kubernetes.io/projected/14c28b2f-8076-4123-8a7b-e907e8d88a30-kube-api-access-p4dnw\") pod \"keystone-db-create-jfsfm\" (UID: \"14c28b2f-8076-4123-8a7b-e907e8d88a30\") " pod="openstack/keystone-db-create-jfsfm" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.240321 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14c28b2f-8076-4123-8a7b-e907e8d88a30-operator-scripts\") pod \"keystone-db-create-jfsfm\" (UID: \"14c28b2f-8076-4123-8a7b-e907e8d88a30\") " pod="openstack/keystone-db-create-jfsfm" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.240374 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9jfw\" (UniqueName: \"kubernetes.io/projected/999a47fa-96dd-4791-88bc-ff5e45fe9d6b-kube-api-access-v9jfw\") pod \"keystone-3772-account-create-update-x874m\" (UID: \"999a47fa-96dd-4791-88bc-ff5e45fe9d6b\") " pod="openstack/keystone-3772-account-create-update-x874m" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.240434 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/999a47fa-96dd-4791-88bc-ff5e45fe9d6b-operator-scripts\") pod \"keystone-3772-account-create-update-x874m\" (UID: \"999a47fa-96dd-4791-88bc-ff5e45fe9d6b\") " pod="openstack/keystone-3772-account-create-update-x874m" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.241460 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14c28b2f-8076-4123-8a7b-e907e8d88a30-operator-scripts\") pod \"keystone-db-create-jfsfm\" (UID: \"14c28b2f-8076-4123-8a7b-e907e8d88a30\") " pod="openstack/keystone-db-create-jfsfm" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.269007 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4dnw\" (UniqueName: \"kubernetes.io/projected/14c28b2f-8076-4123-8a7b-e907e8d88a30-kube-api-access-p4dnw\") pod 
\"keystone-db-create-jfsfm\" (UID: \"14c28b2f-8076-4123-8a7b-e907e8d88a30\") " pod="openstack/keystone-db-create-jfsfm" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.299826 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jfsfm" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.341724 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9jfw\" (UniqueName: \"kubernetes.io/projected/999a47fa-96dd-4791-88bc-ff5e45fe9d6b-kube-api-access-v9jfw\") pod \"keystone-3772-account-create-update-x874m\" (UID: \"999a47fa-96dd-4791-88bc-ff5e45fe9d6b\") " pod="openstack/keystone-3772-account-create-update-x874m" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.341815 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/999a47fa-96dd-4791-88bc-ff5e45fe9d6b-operator-scripts\") pod \"keystone-3772-account-create-update-x874m\" (UID: \"999a47fa-96dd-4791-88bc-ff5e45fe9d6b\") " pod="openstack/keystone-3772-account-create-update-x874m" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.342666 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/999a47fa-96dd-4791-88bc-ff5e45fe9d6b-operator-scripts\") pod \"keystone-3772-account-create-update-x874m\" (UID: \"999a47fa-96dd-4791-88bc-ff5e45fe9d6b\") " pod="openstack/keystone-3772-account-create-update-x874m" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.381598 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9jfw\" (UniqueName: \"kubernetes.io/projected/999a47fa-96dd-4791-88bc-ff5e45fe9d6b-kube-api-access-v9jfw\") pod \"keystone-3772-account-create-update-x874m\" (UID: \"999a47fa-96dd-4791-88bc-ff5e45fe9d6b\") " pod="openstack/keystone-3772-account-create-update-x874m" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.422248 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3772-account-create-update-x874m" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.468881 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-x849h"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.470022 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-x849h" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.480008 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-x849h"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.548517 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5rwz\" (UniqueName: \"kubernetes.io/projected/dfdc1d20-5b7c-4dff-988a-a8528d764fdf-kube-api-access-r5rwz\") pod \"placement-db-create-x849h\" (UID: \"dfdc1d20-5b7c-4dff-988a-a8528d764fdf\") " pod="openstack/placement-db-create-x849h" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.548631 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfdc1d20-5b7c-4dff-988a-a8528d764fdf-operator-scripts\") pod \"placement-db-create-x849h\" (UID: \"dfdc1d20-5b7c-4dff-988a-a8528d764fdf\") " pod="openstack/placement-db-create-x849h" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.555105 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-4d2c-account-create-update-d4hsr"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.556250 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4d2c-account-create-update-d4hsr" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.559955 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.565407 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4d2c-account-create-update-d4hsr"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.650880 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jzl7\" (UniqueName: \"kubernetes.io/projected/7bef25f3-d94c-4f4a-aa88-e48fb532fcec-kube-api-access-5jzl7\") pod \"placement-4d2c-account-create-update-d4hsr\" (UID: \"7bef25f3-d94c-4f4a-aa88-e48fb532fcec\") " pod="openstack/placement-4d2c-account-create-update-d4hsr" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.650964 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5rwz\" (UniqueName: \"kubernetes.io/projected/dfdc1d20-5b7c-4dff-988a-a8528d764fdf-kube-api-access-r5rwz\") pod \"placement-db-create-x849h\" (UID: \"dfdc1d20-5b7c-4dff-988a-a8528d764fdf\") " pod="openstack/placement-db-create-x849h" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.651052 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfdc1d20-5b7c-4dff-988a-a8528d764fdf-operator-scripts\") pod \"placement-db-create-x849h\" (UID: \"dfdc1d20-5b7c-4dff-988a-a8528d764fdf\") " pod="openstack/placement-db-create-x849h" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.651187 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bef25f3-d94c-4f4a-aa88-e48fb532fcec-operator-scripts\") pod \"placement-4d2c-account-create-update-d4hsr\" (UID: \"7bef25f3-d94c-4f4a-aa88-e48fb532fcec\") " pod="openstack/placement-4d2c-account-create-update-d4hsr" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.652077 4585 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfdc1d20-5b7c-4dff-988a-a8528d764fdf-operator-scripts\") pod \"placement-db-create-x849h\" (UID: \"dfdc1d20-5b7c-4dff-988a-a8528d764fdf\") " pod="openstack/placement-db-create-x849h" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.668027 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5rwz\" (UniqueName: \"kubernetes.io/projected/dfdc1d20-5b7c-4dff-988a-a8528d764fdf-kube-api-access-r5rwz\") pod \"placement-db-create-x849h\" (UID: \"dfdc1d20-5b7c-4dff-988a-a8528d764fdf\") " pod="openstack/placement-db-create-x849h" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.739083 4585 generic.go:334] "Generic (PLEG): container finished" podID="522a4623-50b5-4e3c-97bf-d67856196c1f" containerID="2a1ecf6518aae1143de2b3c2518e58fc7979b6d1954c5b5deb5688bca4cee177" exitCode=0 Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.739916 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" event={"ID":"522a4623-50b5-4e3c-97bf-d67856196c1f","Type":"ContainerDied","Data":"2a1ecf6518aae1143de2b3c2518e58fc7979b6d1954c5b5deb5688bca4cee177"} Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.752604 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bef25f3-d94c-4f4a-aa88-e48fb532fcec-operator-scripts\") pod \"placement-4d2c-account-create-update-d4hsr\" (UID: \"7bef25f3-d94c-4f4a-aa88-e48fb532fcec\") " pod="openstack/placement-4d2c-account-create-update-d4hsr" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.752689 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jzl7\" (UniqueName: \"kubernetes.io/projected/7bef25f3-d94c-4f4a-aa88-e48fb532fcec-kube-api-access-5jzl7\") pod \"placement-4d2c-account-create-update-d4hsr\" (UID: \"7bef25f3-d94c-4f4a-aa88-e48fb532fcec\") " pod="openstack/placement-4d2c-account-create-update-d4hsr" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.753278 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bef25f3-d94c-4f4a-aa88-e48fb532fcec-operator-scripts\") pod \"placement-4d2c-account-create-update-d4hsr\" (UID: \"7bef25f3-d94c-4f4a-aa88-e48fb532fcec\") " pod="openstack/placement-4d2c-account-create-update-d4hsr" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.770287 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jzl7\" (UniqueName: \"kubernetes.io/projected/7bef25f3-d94c-4f4a-aa88-e48fb532fcec-kube-api-access-5jzl7\") pod \"placement-4d2c-account-create-update-d4hsr\" (UID: \"7bef25f3-d94c-4f4a-aa88-e48fb532fcec\") " pod="openstack/placement-4d2c-account-create-update-d4hsr" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.796696 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x849h" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.801259 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2bc92"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.802403 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2bc92" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.822958 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2bc92"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.854016 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f4d6-account-create-update-d4w4d"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.855152 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f4d6-account-create-update-d4w4d" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.860127 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.875065 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4d2c-account-create-update-d4hsr" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.875344 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hvjgq"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.877585 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.897587 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f4d6-account-create-update-d4w4d"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.914576 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvjgq"] Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.958980 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4552555a-cd04-402f-84f0-48569cbf5fd8-operator-scripts\") pod \"glance-db-create-2bc92\" (UID: \"4552555a-cd04-402f-84f0-48569cbf5fd8\") " pod="openstack/glance-db-create-2bc92" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.959045 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7l79\" (UniqueName: \"kubernetes.io/projected/dbfb665c-c279-40aa-bf1d-ed326b23d184-kube-api-access-q7l79\") pod \"glance-f4d6-account-create-update-d4w4d\" (UID: \"dbfb665c-c279-40aa-bf1d-ed326b23d184\") " pod="openstack/glance-f4d6-account-create-update-d4w4d" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.959097 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbfb665c-c279-40aa-bf1d-ed326b23d184-operator-scripts\") pod \"glance-f4d6-account-create-update-d4w4d\" (UID: \"dbfb665c-c279-40aa-bf1d-ed326b23d184\") " pod="openstack/glance-f4d6-account-create-update-d4w4d" Dec 01 14:13:58 crc kubenswrapper[4585]: I1201 14:13:58.959143 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67c89\" (UniqueName: \"kubernetes.io/projected/4552555a-cd04-402f-84f0-48569cbf5fd8-kube-api-access-67c89\") pod \"glance-db-create-2bc92\" (UID: \"4552555a-cd04-402f-84f0-48569cbf5fd8\") " pod="openstack/glance-db-create-2bc92" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.061227 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-utilities\") pod \"community-operators-hvjgq\" (UID: \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\") " pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.061387 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-catalog-content\") pod \"community-operators-hvjgq\" (UID: \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\") " pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.061457 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhhqg\" (UniqueName: \"kubernetes.io/projected/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-kube-api-access-zhhqg\") pod \"community-operators-hvjgq\" (UID: \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\") " pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.061517 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4552555a-cd04-402f-84f0-48569cbf5fd8-operator-scripts\") pod \"glance-db-create-2bc92\" (UID: \"4552555a-cd04-402f-84f0-48569cbf5fd8\") " pod="openstack/glance-db-create-2bc92" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.061595 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7l79\" (UniqueName: \"kubernetes.io/projected/dbfb665c-c279-40aa-bf1d-ed326b23d184-kube-api-access-q7l79\") pod \"glance-f4d6-account-create-update-d4w4d\" (UID: \"dbfb665c-c279-40aa-bf1d-ed326b23d184\") " pod="openstack/glance-f4d6-account-create-update-d4w4d" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.061734 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbfb665c-c279-40aa-bf1d-ed326b23d184-operator-scripts\") pod \"glance-f4d6-account-create-update-d4w4d\" (UID: \"dbfb665c-c279-40aa-bf1d-ed326b23d184\") " pod="openstack/glance-f4d6-account-create-update-d4w4d" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.061846 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67c89\" (UniqueName: \"kubernetes.io/projected/4552555a-cd04-402f-84f0-48569cbf5fd8-kube-api-access-67c89\") pod \"glance-db-create-2bc92\" (UID: \"4552555a-cd04-402f-84f0-48569cbf5fd8\") " pod="openstack/glance-db-create-2bc92" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.062460 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4552555a-cd04-402f-84f0-48569cbf5fd8-operator-scripts\") pod \"glance-db-create-2bc92\" (UID: \"4552555a-cd04-402f-84f0-48569cbf5fd8\") " pod="openstack/glance-db-create-2bc92" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.062703 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbfb665c-c279-40aa-bf1d-ed326b23d184-operator-scripts\") pod \"glance-f4d6-account-create-update-d4w4d\" (UID: \"dbfb665c-c279-40aa-bf1d-ed326b23d184\") " pod="openstack/glance-f4d6-account-create-update-d4w4d" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.078053 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-q7l79\" (UniqueName: \"kubernetes.io/projected/dbfb665c-c279-40aa-bf1d-ed326b23d184-kube-api-access-q7l79\") pod \"glance-f4d6-account-create-update-d4w4d\" (UID: \"dbfb665c-c279-40aa-bf1d-ed326b23d184\") " pod="openstack/glance-f4d6-account-create-update-d4w4d" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.079057 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67c89\" (UniqueName: \"kubernetes.io/projected/4552555a-cd04-402f-84f0-48569cbf5fd8-kube-api-access-67c89\") pod \"glance-db-create-2bc92\" (UID: \"4552555a-cd04-402f-84f0-48569cbf5fd8\") " pod="openstack/glance-db-create-2bc92" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.124847 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2bc92" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.163087 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-utilities\") pod \"community-operators-hvjgq\" (UID: \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\") " pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.163455 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-catalog-content\") pod \"community-operators-hvjgq\" (UID: \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\") " pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.163476 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhhqg\" (UniqueName: \"kubernetes.io/projected/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-kube-api-access-zhhqg\") pod \"community-operators-hvjgq\" (UID: \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\") " pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.163568 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-utilities\") pod \"community-operators-hvjgq\" (UID: \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\") " pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.163830 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-catalog-content\") pod \"community-operators-hvjgq\" (UID: \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\") " pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.171821 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f4d6-account-create-update-d4w4d" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.180650 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhhqg\" (UniqueName: \"kubernetes.io/projected/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-kube-api-access-zhhqg\") pod \"community-operators-hvjgq\" (UID: \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\") " pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.218506 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:13:59 crc kubenswrapper[4585]: I1201 14:13:59.471537 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:13:59 crc kubenswrapper[4585]: E1201 14:13:59.471759 4585 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 14:13:59 crc kubenswrapper[4585]: E1201 14:13:59.471790 4585 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 14:13:59 crc kubenswrapper[4585]: E1201 14:13:59.471851 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift podName:3bc5c97f-1882-47da-843c-f8dba234f1f3 nodeName:}" failed. No retries permitted until 2025-12-01 14:14:07.471833462 +0000 UTC m=+961.456047317 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift") pod "swift-storage-0" (UID: "3bc5c97f-1882-47da-843c-f8dba234f1f3") : configmap "swift-ring-files" not found Dec 01 14:14:00 crc kubenswrapper[4585]: I1201 14:14:00.821602 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ncdb4" podUID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerName="registry-server" probeResult="failure" output=< Dec 01 14:14:00 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:14:00 crc kubenswrapper[4585]: > Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.008550 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.020585 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.154665 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522a4623-50b5-4e3c-97bf-d67856196c1f-dns-svc\") pod \"522a4623-50b5-4e3c-97bf-d67856196c1f\" (UID: \"522a4623-50b5-4e3c-97bf-d67856196c1f\") " Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.155251 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbxw6\" (UniqueName: \"kubernetes.io/projected/35d48e93-76cf-4411-9772-d650bb88c378-kube-api-access-fbxw6\") pod \"35d48e93-76cf-4411-9772-d650bb88c378\" (UID: \"35d48e93-76cf-4411-9772-d650bb88c378\") " Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.155403 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35d48e93-76cf-4411-9772-d650bb88c378-dns-svc\") pod \"35d48e93-76cf-4411-9772-d650bb88c378\" (UID: \"35d48e93-76cf-4411-9772-d650bb88c378\") " Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.155450 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35d48e93-76cf-4411-9772-d650bb88c378-config\") pod \"35d48e93-76cf-4411-9772-d650bb88c378\" (UID: \"35d48e93-76cf-4411-9772-d650bb88c378\") " Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.155485 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522a4623-50b5-4e3c-97bf-d67856196c1f-config\") pod \"522a4623-50b5-4e3c-97bf-d67856196c1f\" (UID: \"522a4623-50b5-4e3c-97bf-d67856196c1f\") " Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.155518 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jthv\" (UniqueName: \"kubernetes.io/projected/522a4623-50b5-4e3c-97bf-d67856196c1f-kube-api-access-2jthv\") pod \"522a4623-50b5-4e3c-97bf-d67856196c1f\" (UID: \"522a4623-50b5-4e3c-97bf-d67856196c1f\") " Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.165317 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522a4623-50b5-4e3c-97bf-d67856196c1f-kube-api-access-2jthv" (OuterVolumeSpecName: "kube-api-access-2jthv") pod "522a4623-50b5-4e3c-97bf-d67856196c1f" (UID: "522a4623-50b5-4e3c-97bf-d67856196c1f"). InnerVolumeSpecName "kube-api-access-2jthv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.169623 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d48e93-76cf-4411-9772-d650bb88c378-kube-api-access-fbxw6" (OuterVolumeSpecName: "kube-api-access-fbxw6") pod "35d48e93-76cf-4411-9772-d650bb88c378" (UID: "35d48e93-76cf-4411-9772-d650bb88c378"). InnerVolumeSpecName "kube-api-access-fbxw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.233961 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522a4623-50b5-4e3c-97bf-d67856196c1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "522a4623-50b5-4e3c-97bf-d67856196c1f" (UID: "522a4623-50b5-4e3c-97bf-d67856196c1f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.242115 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35d48e93-76cf-4411-9772-d650bb88c378-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "35d48e93-76cf-4411-9772-d650bb88c378" (UID: "35d48e93-76cf-4411-9772-d650bb88c378"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.257563 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35d48e93-76cf-4411-9772-d650bb88c378-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.257602 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jthv\" (UniqueName: \"kubernetes.io/projected/522a4623-50b5-4e3c-97bf-d67856196c1f-kube-api-access-2jthv\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.257615 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522a4623-50b5-4e3c-97bf-d67856196c1f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.257625 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbxw6\" (UniqueName: \"kubernetes.io/projected/35d48e93-76cf-4411-9772-d650bb88c378-kube-api-access-fbxw6\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.260531 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35d48e93-76cf-4411-9772-d650bb88c378-config" (OuterVolumeSpecName: "config") pod "35d48e93-76cf-4411-9772-d650bb88c378" (UID: "35d48e93-76cf-4411-9772-d650bb88c378"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.272677 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522a4623-50b5-4e3c-97bf-d67856196c1f-config" (OuterVolumeSpecName: "config") pod "522a4623-50b5-4e3c-97bf-d67856196c1f" (UID: "522a4623-50b5-4e3c-97bf-d67856196c1f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.359048 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35d48e93-76cf-4411-9772-d650bb88c378-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.359487 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522a4623-50b5-4e3c-97bf-d67856196c1f-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.439616 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xddp8"] Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.610326 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xb2ht"] Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.763712 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xb2ht" event={"ID":"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53","Type":"ContainerStarted","Data":"133dad6975a3e638ced50ddbd5256be72a3a6128e335a2cb037f1d126d9d3f81"} Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.768679 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7zn7j" event={"ID":"26bcdef2-b1e8-4848-abc4-b1f6a45c9916","Type":"ContainerStarted","Data":"3a388456df96f333c161d96c402bb37c15ae10d548c894f4684d5a591c18c7d7"} Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.825390 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-7zn7j" podStartSLOduration=1.837712649 podStartE2EDuration="9.825369289s" podCreationTimestamp="2025-12-01 14:13:52 +0000 UTC" firstStartedPulling="2025-12-01 14:13:53.056141325 +0000 UTC m=+947.040355180" lastFinishedPulling="2025-12-01 14:14:01.043797965 +0000 UTC m=+955.028011820" observedRunningTime="2025-12-01 14:14:01.818247049 +0000 UTC m=+955.802460904" watchObservedRunningTime="2025-12-01 14:14:01.825369289 +0000 UTC m=+955.809583134" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.833084 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jfsfm"] Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.851218 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" event={"ID":"522a4623-50b5-4e3c-97bf-d67856196c1f","Type":"ContainerDied","Data":"fd31558c02c6399721060ec979262fe5bffce8f002ddc06fb001beb99b7dbce3"} Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.851268 4585 scope.go:117] "RemoveContainer" containerID="2a1ecf6518aae1143de2b3c2518e58fc7979b6d1954c5b5deb5688bca4cee177" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.852978 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.874625 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.882102 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" event={"ID":"35d48e93-76cf-4411-9772-d650bb88c378","Type":"ContainerDied","Data":"10fc8348f876bd89ed59316915e8046b9ad2dec12c5f7a2b4e012fc0aaccbdce"} Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.882332 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-rghfw" Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.889897 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xddp8" event={"ID":"5299e047-b328-440f-a888-8001cad4933b","Type":"ContainerStarted","Data":"8b4d9031fc19e10df95b8c8c391aca3b0dac401d475cf82f3cad0356d9b16bcb"} Dec 01 14:14:01 crc kubenswrapper[4585]: I1201 14:14:01.965362 4585 scope.go:117] "RemoveContainer" containerID="53d5b094c54f144e2e8cccbd318414bf2352303f76dc4ed9c126e64f0dc13550" Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.017918 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-7s67d"] Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.035927 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-7s67d"] Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.068487 4585 scope.go:117] "RemoveContainer" containerID="50ffa3ba2d85cfd74212a0890a8a8bcbe4d3caac88328492211f31bebf254999" Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.095042 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rghfw"] Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.103564 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rghfw"] Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.230259 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3772-account-create-update-x874m"] Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.245346 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvjgq"] Dec 01 14:14:02 crc kubenswrapper[4585]: W1201 14:14:02.264680 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod999a47fa_96dd_4791_88bc_ff5e45fe9d6b.slice/crio-a8eeb80affae679b28d7d057a52b3b988905f1cde689a5b0c18f89bc738d4134 WatchSource:0}: Error finding container a8eeb80affae679b28d7d057a52b3b988905f1cde689a5b0c18f89bc738d4134: Status 404 returned error can't find the container with id a8eeb80affae679b28d7d057a52b3b988905f1cde689a5b0c18f89bc738d4134 Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.299554 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-xqldv"] Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.343467 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4d2c-account-create-update-d4hsr"] Dec 01 14:14:02 crc kubenswrapper[4585]: W1201 14:14:02.354586 4585 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e11bdcb_4b1d_466a_8c52_8dc536396e07.slice/crio-557eac9d5a138a997fd520b8570c2d8a00678d23fba5f4b80336251db22ddc45 WatchSource:0}: Error finding container 557eac9d5a138a997fd520b8570c2d8a00678d23fba5f4b80336251db22ddc45: Status 404 returned error can't find the container with id 557eac9d5a138a997fd520b8570c2d8a00678d23fba5f4b80336251db22ddc45 Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.476282 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35d48e93-76cf-4411-9772-d650bb88c378" path="/var/lib/kubelet/pods/35d48e93-76cf-4411-9772-d650bb88c378/volumes" Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.478391 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="522a4623-50b5-4e3c-97bf-d67856196c1f" path="/var/lib/kubelet/pods/522a4623-50b5-4e3c-97bf-d67856196c1f/volumes" Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.479692 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2bc92"] Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.605696 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f4d6-account-create-update-d4w4d"] Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.629025 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-x849h"] Dec 01 14:14:02 crc kubenswrapper[4585]: W1201 14:14:02.632457 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfdc1d20_5b7c_4dff_988a_a8528d764fdf.slice/crio-81596dc7e5b35ef66a2ba274d66b30c295f7ea006f1c110c26ffc7ee58157324 WatchSource:0}: Error finding container 81596dc7e5b35ef66a2ba274d66b30c295f7ea006f1c110c26ffc7ee58157324: Status 404 returned error can't find the container with id 81596dc7e5b35ef66a2ba274d66b30c295f7ea006f1c110c26ffc7ee58157324 Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.919548 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f4d6-account-create-update-d4w4d" event={"ID":"dbfb665c-c279-40aa-bf1d-ed326b23d184","Type":"ContainerStarted","Data":"43c27be06a622a71404e1b821090528a26a2ca42e936f072ddb0f5d68dd778a5"} Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.921072 4585 generic.go:334] "Generic (PLEG): container finished" podID="ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" containerID="7b098606d3cddb0cc3e7f3ef61839954f1b3e2cfde00bf4575f77c270ab3a030" exitCode=0 Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.921118 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvjgq" event={"ID":"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195","Type":"ContainerDied","Data":"7b098606d3cddb0cc3e7f3ef61839954f1b3e2cfde00bf4575f77c270ab3a030"} Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.921142 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvjgq" event={"ID":"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195","Type":"ContainerStarted","Data":"0197763a998dc6a7277d9d0e5384b706bd7741eb224983bdea15dd3f6d318a6f"} Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.933100 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" event={"ID":"4e11bdcb-4b1d-466a-8c52-8dc536396e07","Type":"ContainerStarted","Data":"beb7a5c661ba2bb8184efe457308ffaa56ab172edf4a3bc6227bfca50a6c524f"} Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.933141 
4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" event={"ID":"4e11bdcb-4b1d-466a-8c52-8dc536396e07","Type":"ContainerStarted","Data":"557eac9d5a138a997fd520b8570c2d8a00678d23fba5f4b80336251db22ddc45"} Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.936475 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xddp8" event={"ID":"5299e047-b328-440f-a888-8001cad4933b","Type":"ContainerStarted","Data":"15111f0c6872800efa138f7d9f47fcfc2698481837356c8f49f0fea05dc29f11"} Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.964727 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4d2c-account-create-update-d4hsr" event={"ID":"7bef25f3-d94c-4f4a-aa88-e48fb532fcec","Type":"ContainerStarted","Data":"bd165c0b7d53bf258740e737c5ee4ef155f34bd7ac5a30965c302947dec9b090"} Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.964797 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4d2c-account-create-update-d4hsr" event={"ID":"7bef25f3-d94c-4f4a-aa88-e48fb532fcec","Type":"ContainerStarted","Data":"d5b26c5a41baba28cc6d77fd3c502656ebbb83e9614d886c5ac4c339c8026b95"} Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.966408 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e15da9d0-0ba7-4885-8da4-89631b7886f6","Type":"ContainerStarted","Data":"bceb83731d205103667dbeb628161021f0e84abbbb1a4a71bf00a79dc9b5583e"} Dec 01 14:14:02 crc kubenswrapper[4585]: I1201 14:14:02.976626 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xddp8" podStartSLOduration=6.976596705 podStartE2EDuration="6.976596705s" podCreationTimestamp="2025-12-01 14:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:02.966874796 +0000 UTC m=+956.951088651" watchObservedRunningTime="2025-12-01 14:14:02.976596705 +0000 UTC m=+956.960810550" Dec 01 14:14:03 crc kubenswrapper[4585]: I1201 14:14:03.046420 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3772-account-create-update-x874m" event={"ID":"999a47fa-96dd-4791-88bc-ff5e45fe9d6b","Type":"ContainerStarted","Data":"ae80d1a98060f280309194e382f1942ff4486a774beaed833c8d1ac660088af4"} Dec 01 14:14:03 crc kubenswrapper[4585]: I1201 14:14:03.046475 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3772-account-create-update-x874m" event={"ID":"999a47fa-96dd-4791-88bc-ff5e45fe9d6b","Type":"ContainerStarted","Data":"a8eeb80affae679b28d7d057a52b3b988905f1cde689a5b0c18f89bc738d4134"} Dec 01 14:14:03 crc kubenswrapper[4585]: I1201 14:14:03.058995 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-4d2c-account-create-update-d4hsr" podStartSLOduration=5.058957081 podStartE2EDuration="5.058957081s" podCreationTimestamp="2025-12-01 14:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:03.039605145 +0000 UTC m=+957.023819000" watchObservedRunningTime="2025-12-01 14:14:03.058957081 +0000 UTC m=+957.043170936" Dec 01 14:14:03 crc kubenswrapper[4585]: I1201 14:14:03.083677 4585 generic.go:334] "Generic (PLEG): container finished" podID="dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" 
containerID="ac96dd283f58772aef8cb613c5fdec3760127dd34d7c4f646cfe65ec65e6f821" exitCode=0 Dec 01 14:14:03 crc kubenswrapper[4585]: I1201 14:14:03.083758 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xb2ht" event={"ID":"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53","Type":"ContainerDied","Data":"ac96dd283f58772aef8cb613c5fdec3760127dd34d7c4f646cfe65ec65e6f821"} Dec 01 14:14:03 crc kubenswrapper[4585]: I1201 14:14:03.165066 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2bc92" event={"ID":"4552555a-cd04-402f-84f0-48569cbf5fd8","Type":"ContainerStarted","Data":"efb2ed550a45d22043b3a1c37c314e2208939f217e6f04f3b25c2a6022186a6d"} Dec 01 14:14:03 crc kubenswrapper[4585]: I1201 14:14:03.165109 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2bc92" event={"ID":"4552555a-cd04-402f-84f0-48569cbf5fd8","Type":"ContainerStarted","Data":"98c46af75035fcd99b636489e5911b0e3e1baf785143034e5ba818bbe410108f"} Dec 01 14:14:03 crc kubenswrapper[4585]: I1201 14:14:03.194878 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jfsfm" event={"ID":"14c28b2f-8076-4123-8a7b-e907e8d88a30","Type":"ContainerStarted","Data":"70a8a58e42054315381a278709450610373b4af816ba77e84d5982bd1209b22b"} Dec 01 14:14:03 crc kubenswrapper[4585]: I1201 14:14:03.194930 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jfsfm" event={"ID":"14c28b2f-8076-4123-8a7b-e907e8d88a30","Type":"ContainerStarted","Data":"0085c3aaecfb152642334e346c5831f5d2bcd77207a4743caefb35ad34097266"} Dec 01 14:14:03 crc kubenswrapper[4585]: I1201 14:14:03.200873 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-2bc92" podStartSLOduration=5.200855603 podStartE2EDuration="5.200855603s" podCreationTimestamp="2025-12-01 14:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:03.195267044 +0000 UTC m=+957.179480899" watchObservedRunningTime="2025-12-01 14:14:03.200855603 +0000 UTC m=+957.185069458" Dec 01 14:14:03 crc kubenswrapper[4585]: I1201 14:14:03.207463 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x849h" event={"ID":"dfdc1d20-5b7c-4dff-988a-a8528d764fdf","Type":"ContainerStarted","Data":"81596dc7e5b35ef66a2ba274d66b30c295f7ea006f1c110c26ffc7ee58157324"} Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.169724 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.222575 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.223021 4585 generic.go:334] "Generic (PLEG): container finished" podID="dfdc1d20-5b7c-4dff-988a-a8528d764fdf" containerID="ffe4c8385d3b9dff71988384ecf41a79109d0c4ba3ae2098894a2eb53f9df2a7" exitCode=0 Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.223080 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x849h" event={"ID":"dfdc1d20-5b7c-4dff-988a-a8528d764fdf","Type":"ContainerDied","Data":"ffe4c8385d3b9dff71988384ecf41a79109d0c4ba3ae2098894a2eb53f9df2a7"} Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.227512 4585 generic.go:334] 
"Generic (PLEG): container finished" podID="999a47fa-96dd-4791-88bc-ff5e45fe9d6b" containerID="ae80d1a98060f280309194e382f1942ff4486a774beaed833c8d1ac660088af4" exitCode=0 Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.227591 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3772-account-create-update-x874m" event={"ID":"999a47fa-96dd-4791-88bc-ff5e45fe9d6b","Type":"ContainerDied","Data":"ae80d1a98060f280309194e382f1942ff4486a774beaed833c8d1ac660088af4"} Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.254052 4585 generic.go:334] "Generic (PLEG): container finished" podID="dbfb665c-c279-40aa-bf1d-ed326b23d184" containerID="5f22c4c22b95834a5e9420127714ab200b326e6508a28f2e24da5548eeaf290a" exitCode=0 Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.254136 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f4d6-account-create-update-d4w4d" event={"ID":"dbfb665c-c279-40aa-bf1d-ed326b23d184","Type":"ContainerDied","Data":"5f22c4c22b95834a5e9420127714ab200b326e6508a28f2e24da5548eeaf290a"} Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.272672 4585 generic.go:334] "Generic (PLEG): container finished" podID="4e11bdcb-4b1d-466a-8c52-8dc536396e07" containerID="beb7a5c661ba2bb8184efe457308ffaa56ab172edf4a3bc6227bfca50a6c524f" exitCode=0 Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.272764 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" event={"ID":"4e11bdcb-4b1d-466a-8c52-8dc536396e07","Type":"ContainerDied","Data":"beb7a5c661ba2bb8184efe457308ffaa56ab172edf4a3bc6227bfca50a6c524f"} Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.273097 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" event={"ID":"4e11bdcb-4b1d-466a-8c52-8dc536396e07","Type":"ContainerStarted","Data":"674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed"} Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.273148 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.290190 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xb2ht" event={"ID":"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53","Type":"ContainerStarted","Data":"4b15fe80d49b4bf44ec4a241ee2fe3d17d389b28995296458a1b5f12adcb2cde"} Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.290888 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.317679 4585 generic.go:334] "Generic (PLEG): container finished" podID="7bef25f3-d94c-4f4a-aa88-e48fb532fcec" containerID="bd165c0b7d53bf258740e737c5ee4ef155f34bd7ac5a30965c302947dec9b090" exitCode=0 Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.317773 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4d2c-account-create-update-d4hsr" event={"ID":"7bef25f3-d94c-4f4a-aa88-e48fb532fcec","Type":"ContainerDied","Data":"bd165c0b7d53bf258740e737c5ee4ef155f34bd7ac5a30965c302947dec9b090"} Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.319723 4585 generic.go:334] "Generic (PLEG): container finished" podID="4552555a-cd04-402f-84f0-48569cbf5fd8" containerID="efb2ed550a45d22043b3a1c37c314e2208939f217e6f04f3b25c2a6022186a6d" exitCode=0 Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.319787 4585 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2bc92" event={"ID":"4552555a-cd04-402f-84f0-48569cbf5fd8","Type":"ContainerDied","Data":"efb2ed550a45d22043b3a1c37c314e2208939f217e6f04f3b25c2a6022186a6d"} Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.320804 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-xb2ht" podStartSLOduration=8.320786156 podStartE2EDuration="8.320786156s" podCreationTimestamp="2025-12-01 14:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:04.315897476 +0000 UTC m=+958.300111331" watchObservedRunningTime="2025-12-01 14:14:04.320786156 +0000 UTC m=+958.305000011" Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.325368 4585 generic.go:334] "Generic (PLEG): container finished" podID="14c28b2f-8076-4123-8a7b-e907e8d88a30" containerID="70a8a58e42054315381a278709450610373b4af816ba77e84d5982bd1209b22b" exitCode=0 Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.326199 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jfsfm" event={"ID":"14c28b2f-8076-4123-8a7b-e907e8d88a30","Type":"ContainerDied","Data":"70a8a58e42054315381a278709450610373b4af816ba77e84d5982bd1209b22b"} Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.340011 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" podStartSLOduration=8.339996058 podStartE2EDuration="8.339996058s" podCreationTimestamp="2025-12-01 14:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:04.334930693 +0000 UTC m=+958.319144548" watchObservedRunningTime="2025-12-01 14:14:04.339996058 +0000 UTC m=+958.324209913" Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.735880 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3772-account-create-update-x874m" Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.859221 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9jfw\" (UniqueName: \"kubernetes.io/projected/999a47fa-96dd-4791-88bc-ff5e45fe9d6b-kube-api-access-v9jfw\") pod \"999a47fa-96dd-4791-88bc-ff5e45fe9d6b\" (UID: \"999a47fa-96dd-4791-88bc-ff5e45fe9d6b\") " Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.859519 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/999a47fa-96dd-4791-88bc-ff5e45fe9d6b-operator-scripts\") pod \"999a47fa-96dd-4791-88bc-ff5e45fe9d6b\" (UID: \"999a47fa-96dd-4791-88bc-ff5e45fe9d6b\") " Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.860109 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/999a47fa-96dd-4791-88bc-ff5e45fe9d6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "999a47fa-96dd-4791-88bc-ff5e45fe9d6b" (UID: "999a47fa-96dd-4791-88bc-ff5e45fe9d6b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.866183 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/999a47fa-96dd-4791-88bc-ff5e45fe9d6b-kube-api-access-v9jfw" (OuterVolumeSpecName: "kube-api-access-v9jfw") pod "999a47fa-96dd-4791-88bc-ff5e45fe9d6b" (UID: "999a47fa-96dd-4791-88bc-ff5e45fe9d6b"). InnerVolumeSpecName "kube-api-access-v9jfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.959284 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jfsfm" Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.961431 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9jfw\" (UniqueName: \"kubernetes.io/projected/999a47fa-96dd-4791-88bc-ff5e45fe9d6b-kube-api-access-v9jfw\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:04 crc kubenswrapper[4585]: I1201 14:14:04.961553 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/999a47fa-96dd-4791-88bc-ff5e45fe9d6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.062718 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14c28b2f-8076-4123-8a7b-e907e8d88a30-operator-scripts\") pod \"14c28b2f-8076-4123-8a7b-e907e8d88a30\" (UID: \"14c28b2f-8076-4123-8a7b-e907e8d88a30\") " Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.062766 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4dnw\" (UniqueName: \"kubernetes.io/projected/14c28b2f-8076-4123-8a7b-e907e8d88a30-kube-api-access-p4dnw\") pod \"14c28b2f-8076-4123-8a7b-e907e8d88a30\" (UID: \"14c28b2f-8076-4123-8a7b-e907e8d88a30\") " Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.064136 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14c28b2f-8076-4123-8a7b-e907e8d88a30-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14c28b2f-8076-4123-8a7b-e907e8d88a30" (UID: "14c28b2f-8076-4123-8a7b-e907e8d88a30"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.067941 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14c28b2f-8076-4123-8a7b-e907e8d88a30-kube-api-access-p4dnw" (OuterVolumeSpecName: "kube-api-access-p4dnw") pod "14c28b2f-8076-4123-8a7b-e907e8d88a30" (UID: "14c28b2f-8076-4123-8a7b-e907e8d88a30"). InnerVolumeSpecName "kube-api-access-p4dnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.164867 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14c28b2f-8076-4123-8a7b-e907e8d88a30-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.165227 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4dnw\" (UniqueName: \"kubernetes.io/projected/14c28b2f-8076-4123-8a7b-e907e8d88a30-kube-api-access-p4dnw\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.335923 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jfsfm" event={"ID":"14c28b2f-8076-4123-8a7b-e907e8d88a30","Type":"ContainerDied","Data":"0085c3aaecfb152642334e346c5831f5d2bcd77207a4743caefb35ad34097266"} Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.336016 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0085c3aaecfb152642334e346c5831f5d2bcd77207a4743caefb35ad34097266" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.336053 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jfsfm" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.339627 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e15da9d0-0ba7-4885-8da4-89631b7886f6","Type":"ContainerStarted","Data":"4b6f94641c5fbb066dcb325c6c45c2672ac2618daa99662b3bae41d98c8922af"} Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.339682 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e15da9d0-0ba7-4885-8da4-89631b7886f6","Type":"ContainerStarted","Data":"c7b47840ec8016039486ffa61a2459c1b3e4e6f0c835bc7511ca1e8f1d3f0cbf"} Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.339727 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.341595 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3772-account-create-update-x874m" event={"ID":"999a47fa-96dd-4791-88bc-ff5e45fe9d6b","Type":"ContainerDied","Data":"a8eeb80affae679b28d7d057a52b3b988905f1cde689a5b0c18f89bc738d4134"} Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.341642 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8eeb80affae679b28d7d057a52b3b988905f1cde689a5b0c18f89bc738d4134" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.341613 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3772-account-create-update-x874m" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.344005 4585 generic.go:334] "Generic (PLEG): container finished" podID="ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" containerID="3a24e83e1e6abbfb923e84dfbd86da400f2a1ffad61c0294af0c1d0627c43acc" exitCode=0 Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.344955 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvjgq" event={"ID":"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195","Type":"ContainerDied","Data":"3a24e83e1e6abbfb923e84dfbd86da400f2a1ffad61c0294af0c1d0627c43acc"} Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.376154 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=6.022461988 podStartE2EDuration="8.376105867s" podCreationTimestamp="2025-12-01 14:13:57 +0000 UTC" firstStartedPulling="2025-12-01 14:14:01.971628467 +0000 UTC m=+955.955842322" lastFinishedPulling="2025-12-01 14:14:04.325272346 +0000 UTC m=+958.309486201" observedRunningTime="2025-12-01 14:14:05.363771679 +0000 UTC m=+959.347985574" watchObservedRunningTime="2025-12-01 14:14:05.376105867 +0000 UTC m=+959.360319732" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.455353 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wwrrs"] Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.455620 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wwrrs" podUID="c27ce924-56e0-4896-9809-6b44ba1c215b" containerName="registry-server" containerID="cri-o://3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381" gracePeriod=2 Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.741869 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x849h" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.893963 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfdc1d20-5b7c-4dff-988a-a8528d764fdf-operator-scripts\") pod \"dfdc1d20-5b7c-4dff-988a-a8528d764fdf\" (UID: \"dfdc1d20-5b7c-4dff-988a-a8528d764fdf\") " Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.894026 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5rwz\" (UniqueName: \"kubernetes.io/projected/dfdc1d20-5b7c-4dff-988a-a8528d764fdf-kube-api-access-r5rwz\") pod \"dfdc1d20-5b7c-4dff-988a-a8528d764fdf\" (UID: \"dfdc1d20-5b7c-4dff-988a-a8528d764fdf\") " Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.894522 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfdc1d20-5b7c-4dff-988a-a8528d764fdf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dfdc1d20-5b7c-4dff-988a-a8528d764fdf" (UID: "dfdc1d20-5b7c-4dff-988a-a8528d764fdf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.912251 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfdc1d20-5b7c-4dff-988a-a8528d764fdf-kube-api-access-r5rwz" (OuterVolumeSpecName: "kube-api-access-r5rwz") pod "dfdc1d20-5b7c-4dff-988a-a8528d764fdf" (UID: "dfdc1d20-5b7c-4dff-988a-a8528d764fdf"). 
InnerVolumeSpecName "kube-api-access-r5rwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.973924 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7cb5889db5-7s67d" podUID="522a4623-50b5-4e3c-97bf-d67856196c1f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: i/o timeout" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.998351 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfdc1d20-5b7c-4dff-988a-a8528d764fdf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:05 crc kubenswrapper[4585]: I1201 14:14:05.998392 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5rwz\" (UniqueName: \"kubernetes.io/projected/dfdc1d20-5b7c-4dff-988a-a8528d764fdf-kube-api-access-r5rwz\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.098082 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2bc92" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.104050 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4d2c-account-create-update-d4hsr" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.112423 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f4d6-account-create-update-d4w4d" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.138641 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.205106 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbfb665c-c279-40aa-bf1d-ed326b23d184-operator-scripts\") pod \"dbfb665c-c279-40aa-bf1d-ed326b23d184\" (UID: \"dbfb665c-c279-40aa-bf1d-ed326b23d184\") " Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.205206 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4552555a-cd04-402f-84f0-48569cbf5fd8-operator-scripts\") pod \"4552555a-cd04-402f-84f0-48569cbf5fd8\" (UID: \"4552555a-cd04-402f-84f0-48569cbf5fd8\") " Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.205346 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7l79\" (UniqueName: \"kubernetes.io/projected/dbfb665c-c279-40aa-bf1d-ed326b23d184-kube-api-access-q7l79\") pod \"dbfb665c-c279-40aa-bf1d-ed326b23d184\" (UID: \"dbfb665c-c279-40aa-bf1d-ed326b23d184\") " Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.205387 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jzl7\" (UniqueName: \"kubernetes.io/projected/7bef25f3-d94c-4f4a-aa88-e48fb532fcec-kube-api-access-5jzl7\") pod \"7bef25f3-d94c-4f4a-aa88-e48fb532fcec\" (UID: \"7bef25f3-d94c-4f4a-aa88-e48fb532fcec\") " Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.205451 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67c89\" (UniqueName: \"kubernetes.io/projected/4552555a-cd04-402f-84f0-48569cbf5fd8-kube-api-access-67c89\") pod \"4552555a-cd04-402f-84f0-48569cbf5fd8\" (UID: 
\"4552555a-cd04-402f-84f0-48569cbf5fd8\") " Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.205486 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bef25f3-d94c-4f4a-aa88-e48fb532fcec-operator-scripts\") pod \"7bef25f3-d94c-4f4a-aa88-e48fb532fcec\" (UID: \"7bef25f3-d94c-4f4a-aa88-e48fb532fcec\") " Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.209530 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbfb665c-c279-40aa-bf1d-ed326b23d184-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbfb665c-c279-40aa-bf1d-ed326b23d184" (UID: "dbfb665c-c279-40aa-bf1d-ed326b23d184"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.209884 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4552555a-cd04-402f-84f0-48569cbf5fd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4552555a-cd04-402f-84f0-48569cbf5fd8" (UID: "4552555a-cd04-402f-84f0-48569cbf5fd8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.215215 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbfb665c-c279-40aa-bf1d-ed326b23d184-kube-api-access-q7l79" (OuterVolumeSpecName: "kube-api-access-q7l79") pod "dbfb665c-c279-40aa-bf1d-ed326b23d184" (UID: "dbfb665c-c279-40aa-bf1d-ed326b23d184"). InnerVolumeSpecName "kube-api-access-q7l79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.216233 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4552555a-cd04-402f-84f0-48569cbf5fd8-kube-api-access-67c89" (OuterVolumeSpecName: "kube-api-access-67c89") pod "4552555a-cd04-402f-84f0-48569cbf5fd8" (UID: "4552555a-cd04-402f-84f0-48569cbf5fd8"). InnerVolumeSpecName "kube-api-access-67c89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.217633 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bef25f3-d94c-4f4a-aa88-e48fb532fcec-kube-api-access-5jzl7" (OuterVolumeSpecName: "kube-api-access-5jzl7") pod "7bef25f3-d94c-4f4a-aa88-e48fb532fcec" (UID: "7bef25f3-d94c-4f4a-aa88-e48fb532fcec"). InnerVolumeSpecName "kube-api-access-5jzl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.218586 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bef25f3-d94c-4f4a-aa88-e48fb532fcec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bef25f3-d94c-4f4a-aa88-e48fb532fcec" (UID: "7bef25f3-d94c-4f4a-aa88-e48fb532fcec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.312261 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c27ce924-56e0-4896-9809-6b44ba1c215b-catalog-content\") pod \"c27ce924-56e0-4896-9809-6b44ba1c215b\" (UID: \"c27ce924-56e0-4896-9809-6b44ba1c215b\") " Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.312385 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvz9t\" (UniqueName: \"kubernetes.io/projected/c27ce924-56e0-4896-9809-6b44ba1c215b-kube-api-access-fvz9t\") pod \"c27ce924-56e0-4896-9809-6b44ba1c215b\" (UID: \"c27ce924-56e0-4896-9809-6b44ba1c215b\") " Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.312435 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c27ce924-56e0-4896-9809-6b44ba1c215b-utilities\") pod \"c27ce924-56e0-4896-9809-6b44ba1c215b\" (UID: \"c27ce924-56e0-4896-9809-6b44ba1c215b\") " Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.312728 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7l79\" (UniqueName: \"kubernetes.io/projected/dbfb665c-c279-40aa-bf1d-ed326b23d184-kube-api-access-q7l79\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.312742 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jzl7\" (UniqueName: \"kubernetes.io/projected/7bef25f3-d94c-4f4a-aa88-e48fb532fcec-kube-api-access-5jzl7\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.312751 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67c89\" (UniqueName: \"kubernetes.io/projected/4552555a-cd04-402f-84f0-48569cbf5fd8-kube-api-access-67c89\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.312760 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bef25f3-d94c-4f4a-aa88-e48fb532fcec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.312768 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbfb665c-c279-40aa-bf1d-ed326b23d184-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.312777 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4552555a-cd04-402f-84f0-48569cbf5fd8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.316217 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c27ce924-56e0-4896-9809-6b44ba1c215b-utilities" (OuterVolumeSpecName: "utilities") pod "c27ce924-56e0-4896-9809-6b44ba1c215b" (UID: "c27ce924-56e0-4896-9809-6b44ba1c215b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.318176 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27ce924-56e0-4896-9809-6b44ba1c215b-kube-api-access-fvz9t" (OuterVolumeSpecName: "kube-api-access-fvz9t") pod "c27ce924-56e0-4896-9809-6b44ba1c215b" (UID: "c27ce924-56e0-4896-9809-6b44ba1c215b"). InnerVolumeSpecName "kube-api-access-fvz9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.364571 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c27ce924-56e0-4896-9809-6b44ba1c215b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c27ce924-56e0-4896-9809-6b44ba1c215b" (UID: "c27ce924-56e0-4896-9809-6b44ba1c215b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.376051 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4d2c-account-create-update-d4hsr" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.376342 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4d2c-account-create-update-d4hsr" event={"ID":"7bef25f3-d94c-4f4a-aa88-e48fb532fcec","Type":"ContainerDied","Data":"d5b26c5a41baba28cc6d77fd3c502656ebbb83e9614d886c5ac4c339c8026b95"} Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.376378 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5b26c5a41baba28cc6d77fd3c502656ebbb83e9614d886c5ac4c339c8026b95" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.379650 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2bc92" event={"ID":"4552555a-cd04-402f-84f0-48569cbf5fd8","Type":"ContainerDied","Data":"98c46af75035fcd99b636489e5911b0e3e1baf785143034e5ba818bbe410108f"} Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.379679 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98c46af75035fcd99b636489e5911b0e3e1baf785143034e5ba818bbe410108f" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.379741 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2bc92" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.389479 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x849h" event={"ID":"dfdc1d20-5b7c-4dff-988a-a8528d764fdf","Type":"ContainerDied","Data":"81596dc7e5b35ef66a2ba274d66b30c295f7ea006f1c110c26ffc7ee58157324"} Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.389519 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81596dc7e5b35ef66a2ba274d66b30c295f7ea006f1c110c26ffc7ee58157324" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.390120 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-x849h" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.399010 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f4d6-account-create-update-d4w4d" event={"ID":"dbfb665c-c279-40aa-bf1d-ed326b23d184","Type":"ContainerDied","Data":"43c27be06a622a71404e1b821090528a26a2ca42e936f072ddb0f5d68dd778a5"} Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.399057 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43c27be06a622a71404e1b821090528a26a2ca42e936f072ddb0f5d68dd778a5" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.399210 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f4d6-account-create-update-d4w4d" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.401501 4585 generic.go:334] "Generic (PLEG): container finished" podID="c27ce924-56e0-4896-9809-6b44ba1c215b" containerID="3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381" exitCode=0 Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.404382 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wwrrs" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.404860 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwrrs" event={"ID":"c27ce924-56e0-4896-9809-6b44ba1c215b","Type":"ContainerDied","Data":"3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381"} Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.404907 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wwrrs" event={"ID":"c27ce924-56e0-4896-9809-6b44ba1c215b","Type":"ContainerDied","Data":"251bb33fb2202960997322c33ff2b83b7a245b778256961376467542e98738f1"} Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.404930 4585 scope.go:117] "RemoveContainer" containerID="3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.408946 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvjgq" event={"ID":"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195","Type":"ContainerStarted","Data":"7519df874c2b286bffcd27bcc6863ca34d10076fd854e48742dfa6960dfddbba"} Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.418129 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvz9t\" (UniqueName: \"kubernetes.io/projected/c27ce924-56e0-4896-9809-6b44ba1c215b-kube-api-access-fvz9t\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.418167 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c27ce924-56e0-4896-9809-6b44ba1c215b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.418178 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c27ce924-56e0-4896-9809-6b44ba1c215b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.443190 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hvjgq" podStartSLOduration=5.2484487810000005 podStartE2EDuration="8.44316542s" podCreationTimestamp="2025-12-01 14:13:58 +0000 UTC" 
firstStartedPulling="2025-12-01 14:14:02.926215532 +0000 UTC m=+956.910429387" lastFinishedPulling="2025-12-01 14:14:06.120932171 +0000 UTC m=+960.105146026" observedRunningTime="2025-12-01 14:14:06.439280367 +0000 UTC m=+960.423494222" watchObservedRunningTime="2025-12-01 14:14:06.44316542 +0000 UTC m=+960.427379275" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.507684 4585 scope.go:117] "RemoveContainer" containerID="51dce2d73e3eb49ba393ede7f058a91cd475563b03913982b5e7df3148f643c5" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.526508 4585 scope.go:117] "RemoveContainer" containerID="5fb137ae4e61f67062fe6cf034ea1b9fd0c0b6986ead83b4d1c29d954332db67" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.570418 4585 scope.go:117] "RemoveContainer" containerID="3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381" Dec 01 14:14:06 crc kubenswrapper[4585]: E1201 14:14:06.571276 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381\": container with ID starting with 3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381 not found: ID does not exist" containerID="3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.571306 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381"} err="failed to get container status \"3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381\": rpc error: code = NotFound desc = could not find container \"3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381\": container with ID starting with 3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381 not found: ID does not exist" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.571326 4585 scope.go:117] "RemoveContainer" containerID="51dce2d73e3eb49ba393ede7f058a91cd475563b03913982b5e7df3148f643c5" Dec 01 14:14:06 crc kubenswrapper[4585]: E1201 14:14:06.571842 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51dce2d73e3eb49ba393ede7f058a91cd475563b03913982b5e7df3148f643c5\": container with ID starting with 51dce2d73e3eb49ba393ede7f058a91cd475563b03913982b5e7df3148f643c5 not found: ID does not exist" containerID="51dce2d73e3eb49ba393ede7f058a91cd475563b03913982b5e7df3148f643c5" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.571863 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51dce2d73e3eb49ba393ede7f058a91cd475563b03913982b5e7df3148f643c5"} err="failed to get container status \"51dce2d73e3eb49ba393ede7f058a91cd475563b03913982b5e7df3148f643c5\": rpc error: code = NotFound desc = could not find container \"51dce2d73e3eb49ba393ede7f058a91cd475563b03913982b5e7df3148f643c5\": container with ID starting with 51dce2d73e3eb49ba393ede7f058a91cd475563b03913982b5e7df3148f643c5 not found: ID does not exist" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.571879 4585 scope.go:117] "RemoveContainer" containerID="5fb137ae4e61f67062fe6cf034ea1b9fd0c0b6986ead83b4d1c29d954332db67" Dec 01 14:14:06 crc kubenswrapper[4585]: E1201 14:14:06.572253 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5fb137ae4e61f67062fe6cf034ea1b9fd0c0b6986ead83b4d1c29d954332db67\": container with ID starting with 5fb137ae4e61f67062fe6cf034ea1b9fd0c0b6986ead83b4d1c29d954332db67 not found: ID does not exist" containerID="5fb137ae4e61f67062fe6cf034ea1b9fd0c0b6986ead83b4d1c29d954332db67" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.572271 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb137ae4e61f67062fe6cf034ea1b9fd0c0b6986ead83b4d1c29d954332db67"} err="failed to get container status \"5fb137ae4e61f67062fe6cf034ea1b9fd0c0b6986ead83b4d1c29d954332db67\": rpc error: code = NotFound desc = could not find container \"5fb137ae4e61f67062fe6cf034ea1b9fd0c0b6986ead83b4d1c29d954332db67\": container with ID starting with 5fb137ae4e61f67062fe6cf034ea1b9fd0c0b6986ead83b4d1c29d954332db67 not found: ID does not exist" Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.609896 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wwrrs"] Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.616623 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wwrrs"] Dec 01 14:14:06 crc kubenswrapper[4585]: I1201 14:14:06.815682 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:14:07 crc kubenswrapper[4585]: I1201 14:14:07.539827 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:14:07 crc kubenswrapper[4585]: E1201 14:14:07.540051 4585 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 01 14:14:07 crc kubenswrapper[4585]: E1201 14:14:07.540075 4585 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 01 14:14:07 crc kubenswrapper[4585]: E1201 14:14:07.540135 4585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift podName:3bc5c97f-1882-47da-843c-f8dba234f1f3 nodeName:}" failed. No retries permitted until 2025-12-01 14:14:23.540110021 +0000 UTC m=+977.524323876 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift") pod "swift-storage-0" (UID: "3bc5c97f-1882-47da-843c-f8dba234f1f3") : configmap "swift-ring-files" not found Dec 01 14:14:08 crc kubenswrapper[4585]: I1201 14:14:08.422989 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c27ce924-56e0-4896-9809-6b44ba1c215b" path="/var/lib/kubelet/pods/c27ce924-56e0-4896-9809-6b44ba1c215b/volumes" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.040146 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-rvrx7"] Dec 01 14:14:09 crc kubenswrapper[4585]: E1201 14:14:09.040756 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbfb665c-c279-40aa-bf1d-ed326b23d184" containerName="mariadb-account-create-update" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.040769 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbfb665c-c279-40aa-bf1d-ed326b23d184" containerName="mariadb-account-create-update" Dec 01 14:14:09 crc kubenswrapper[4585]: E1201 14:14:09.040780 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="999a47fa-96dd-4791-88bc-ff5e45fe9d6b" containerName="mariadb-account-create-update" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.040786 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="999a47fa-96dd-4791-88bc-ff5e45fe9d6b" containerName="mariadb-account-create-update" Dec 01 14:14:09 crc kubenswrapper[4585]: E1201 14:14:09.040798 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bef25f3-d94c-4f4a-aa88-e48fb532fcec" containerName="mariadb-account-create-update" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.040804 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bef25f3-d94c-4f4a-aa88-e48fb532fcec" containerName="mariadb-account-create-update" Dec 01 14:14:09 crc kubenswrapper[4585]: E1201 14:14:09.040815 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14c28b2f-8076-4123-8a7b-e907e8d88a30" containerName="mariadb-database-create" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.040822 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="14c28b2f-8076-4123-8a7b-e907e8d88a30" containerName="mariadb-database-create" Dec 01 14:14:09 crc kubenswrapper[4585]: E1201 14:14:09.040837 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdc1d20-5b7c-4dff-988a-a8528d764fdf" containerName="mariadb-database-create" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.040845 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdc1d20-5b7c-4dff-988a-a8528d764fdf" containerName="mariadb-database-create" Dec 01 14:14:09 crc kubenswrapper[4585]: E1201 14:14:09.040858 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c27ce924-56e0-4896-9809-6b44ba1c215b" containerName="extract-utilities" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.040866 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27ce924-56e0-4896-9809-6b44ba1c215b" containerName="extract-utilities" Dec 01 14:14:09 crc kubenswrapper[4585]: E1201 14:14:09.040873 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c27ce924-56e0-4896-9809-6b44ba1c215b" containerName="extract-content" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.040880 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27ce924-56e0-4896-9809-6b44ba1c215b" containerName="extract-content" 
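
Note on the etc-swift mount failure above: the MountVolume.SetUp error for swift-storage-0 (configmap "swift-ring-files" not found) is not fatal. The volume manager simply reschedules the operation, and the nestedpendingoperations entry puts the next attempt 16 seconds out (durationBeforeRetry 16s), which is consistent with a per-operation backoff that doubles after each failure while the swift-ring-rebalance job is still building the rings. The Go sketch below illustrates that doubling pattern only; the 500ms starting delay and the two-minute cap are assumptions made for illustration, not values taken from this log or from kubelet source.

package main

import (
    "fmt"
    "time"
)

func main() {
    // Doubling backoff sketch; 500ms start and 2m cap are assumed, not
    // read from the kubelet.
    delay := 500 * time.Millisecond
    maxDelay := 2 * time.Minute

    for failures := 1; failures <= 7; failures++ {
        fmt.Printf("after failure %d, next retry in %v\n", failures, delay)
        delay *= 2
        if delay > maxDelay {
            delay = maxDelay
        }
    }
    // Failure 6 lands on 16s, matching the durationBeforeRetry reported
    // for the swift-storage-0 etc-swift volume above.
}
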
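
Note on the RemoveContainer / "ContainerStatus from runtime service failed ... NotFound" pairs around 14:14:06: the kubelet is asking CRI-O about container IDs it has just deleted, so the NotFound responses are logged as errors but appear harmless, since a container that no longer exists needs no further cleanup. Below is a stdlib-only sketch of that "treat NotFound as already removed" pattern; the helper names are hypothetical, the fake status call just reproduces the error text seen above, and the string match stands in for a proper gRPC status-code check.

package main

import (
    "errors"
    "fmt"
    "strings"
)

// fakeContainerStatus stands in for a CRI ContainerStatus call; the error
// text mirrors what CRI-O returned for 3bf3fd21... in the log above.
func fakeContainerStatus(id string) error {
    return fmt.Errorf("rpc error: code = NotFound desc = could not find container %q", id)
}

// isNotFound is a simplification: the kubelet inspects the gRPC status code,
// while this sketch just matches the rendered error string.
func isNotFound(err error) bool {
    return err != nil && strings.Contains(err.Error(), "code = NotFound")
}

// removeIfPresent is a hypothetical helper showing why the NotFound errors
// above are benign: a container that is already gone needs no cleanup.
func removeIfPresent(id string) error {
    if err := fakeContainerStatus(id); err != nil {
        if isNotFound(err) {
            fmt.Printf("container %s already gone, nothing to remove\n", id)
            return nil
        }
        return errors.New("status lookup failed: " + err.Error())
    }
    fmt.Printf("removing container %s\n", id)
    return nil
}

func main() {
    _ = removeIfPresent("3bf3fd2122dbab8298d005cce01a835477f0b0863fefd97c684be49b4fa8d381")
}
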
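
Note on the pod_startup_latency_tracker entries in this excerpt (swift-ring-rebalance-7zn7j, ovn-northd-0, community-operators-hvjgq, and others): podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that same interval with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted; pods that never pulled (zero-value pull timestamps) report the two durations as equal. The Go sketch below reproduces that arithmetic from the timestamps logged for swift-ring-rebalance-7zn7j; the subtraction is my reading of the logged numbers, not a quote of the tracker's implementation.

package main

import (
    "fmt"
    "time"
)

// Timestamps copied from the "Observed pod startup duration" entry for
// openstack/swift-ring-rebalance-7zn7j above.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
    t, err := time.Parse(layout, s)
    if err != nil {
        panic(err)
    }
    return t
}

func main() {
    created := mustParse("2025-12-01 14:13:52 +0000 UTC")
    firstPull := mustParse("2025-12-01 14:13:53.056141325 +0000 UTC")
    lastPull := mustParse("2025-12-01 14:14:01.043797965 +0000 UTC")
    observed := mustParse("2025-12-01 14:14:01.825369289 +0000 UTC")

    e2e := observed.Sub(created)       // podStartE2EDuration
    pulling := lastPull.Sub(firstPull) // time spent pulling images
    slo := e2e - pulling               // podStartSLOduration, as read from the logged numbers

    fmt.Printf("e2e=%v pulling=%v slo=%v\n", e2e, pulling, slo)
    // e2e (9.825369289s) and slo (1.837712649s) match the podStartE2EDuration
    // and podStartSLOduration logged above for swift-ring-rebalance-7zn7j.
}
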
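
Note on the cadvisor manager.go warnings ("Failed to process watch event ... Status 404"): they reference raw cgroup paths under kubepods.slice, and both the pod UID (with dashes encoded as underscores) and the container ID can be read back out of those paths, which helps correlate the 404s with the PLEG events that follow; here the a8eeb80... ID belongs to the keystone-3772-account-create-update-x874m sandbox (pod UID 999a47fa-96dd-4791-88bc-ff5e45fe9d6b). The sketch below decodes one path copied from the log; the parsing is ad hoc for the systemd cgroup naming seen here, not a general cgroup parser.

package main

import (
    "fmt"
    "path"
    "strings"
)

// cgroupPath is copied from the "Failed to process watch event" warning above.
const cgroupPath = "/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod999a47fa_96dd_4791_88bc_ff5e45fe9d6b.slice/crio-a8eeb80affae679b28d7d057a52b3b988905f1cde689a5b0c18f89bc738d4134"

func main() {
    parts := strings.Split(cgroupPath, "/")
    slice := parts[len(parts)-2] // kubepods-besteffort-pod<uid>.slice
    uid := strings.TrimSuffix(slice, ".slice")
    uid = uid[strings.Index(uid, "-pod")+len("-pod"):] // UID with underscores
    uid = strings.ReplaceAll(uid, "_", "-")
    container := strings.TrimPrefix(path.Base(cgroupPath), "crio-")

    fmt.Println("pod UID:     ", uid)       // 999a47fa-96dd-4791-88bc-ff5e45fe9d6b
    fmt.Println("container ID:", container) // a8eeb80affae...
}
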
Dec 01 14:14:09 crc kubenswrapper[4585]: E1201 14:14:09.040894 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d48e93-76cf-4411-9772-d650bb88c378" containerName="init" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.040902 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d48e93-76cf-4411-9772-d650bb88c378" containerName="init" Dec 01 14:14:09 crc kubenswrapper[4585]: E1201 14:14:09.040914 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4552555a-cd04-402f-84f0-48569cbf5fd8" containerName="mariadb-database-create" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.040922 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="4552555a-cd04-402f-84f0-48569cbf5fd8" containerName="mariadb-database-create" Dec 01 14:14:09 crc kubenswrapper[4585]: E1201 14:14:09.040933 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522a4623-50b5-4e3c-97bf-d67856196c1f" containerName="init" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.040940 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="522a4623-50b5-4e3c-97bf-d67856196c1f" containerName="init" Dec 01 14:14:09 crc kubenswrapper[4585]: E1201 14:14:09.040957 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c27ce924-56e0-4896-9809-6b44ba1c215b" containerName="registry-server" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.040963 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27ce924-56e0-4896-9809-6b44ba1c215b" containerName="registry-server" Dec 01 14:14:09 crc kubenswrapper[4585]: E1201 14:14:09.040993 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522a4623-50b5-4e3c-97bf-d67856196c1f" containerName="dnsmasq-dns" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.041003 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="522a4623-50b5-4e3c-97bf-d67856196c1f" containerName="dnsmasq-dns" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.041169 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d48e93-76cf-4411-9772-d650bb88c378" containerName="init" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.041206 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfdc1d20-5b7c-4dff-988a-a8528d764fdf" containerName="mariadb-database-create" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.041225 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="4552555a-cd04-402f-84f0-48569cbf5fd8" containerName="mariadb-database-create" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.041234 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbfb665c-c279-40aa-bf1d-ed326b23d184" containerName="mariadb-account-create-update" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.041243 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c27ce924-56e0-4896-9809-6b44ba1c215b" containerName="registry-server" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.041256 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="14c28b2f-8076-4123-8a7b-e907e8d88a30" containerName="mariadb-database-create" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.041273 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bef25f3-d94c-4f4a-aa88-e48fb532fcec" containerName="mariadb-account-create-update" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.041288 4585 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="522a4623-50b5-4e3c-97bf-d67856196c1f" containerName="dnsmasq-dns" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.041302 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="999a47fa-96dd-4791-88bc-ff5e45fe9d6b" containerName="mariadb-account-create-update" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.042010 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.046088 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.054791 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rvrx7"] Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.059122 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rvw5c" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.077011 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-config-data\") pod \"glance-db-sync-rvrx7\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.077076 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-combined-ca-bundle\") pod \"glance-db-sync-rvrx7\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.077224 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lrq8\" (UniqueName: \"kubernetes.io/projected/3732bb19-81e6-42db-88c4-f0476ae5ace4-kube-api-access-4lrq8\") pod \"glance-db-sync-rvrx7\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.077289 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-db-sync-config-data\") pod \"glance-db-sync-rvrx7\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.179645 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-config-data\") pod \"glance-db-sync-rvrx7\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.179741 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-combined-ca-bundle\") pod \"glance-db-sync-rvrx7\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.179833 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lrq8\" (UniqueName: 
\"kubernetes.io/projected/3732bb19-81e6-42db-88c4-f0476ae5ace4-kube-api-access-4lrq8\") pod \"glance-db-sync-rvrx7\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.179909 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-db-sync-config-data\") pod \"glance-db-sync-rvrx7\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.186316 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-combined-ca-bundle\") pod \"glance-db-sync-rvrx7\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.186465 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-config-data\") pod \"glance-db-sync-rvrx7\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.187097 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-db-sync-config-data\") pod \"glance-db-sync-rvrx7\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.199890 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lrq8\" (UniqueName: \"kubernetes.io/projected/3732bb19-81e6-42db-88c4-f0476ae5ace4-kube-api-access-4lrq8\") pod \"glance-db-sync-rvrx7\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.219903 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.220133 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.263421 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.359595 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.637334 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2d6v"] Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.637856 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p2d6v" podUID="29ef2e82-35e4-4b31-9324-4fd4274a82b1" containerName="registry-server" containerID="cri-o://2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f" gracePeriod=2 Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.860793 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:14:09 crc kubenswrapper[4585]: I1201 14:14:09.945075 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.018409 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rvrx7"] Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.138641 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.216548 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ef2e82-35e4-4b31-9324-4fd4274a82b1-utilities\") pod \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\" (UID: \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\") " Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.217211 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ef2e82-35e4-4b31-9324-4fd4274a82b1-catalog-content\") pod \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\" (UID: \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\") " Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.217348 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pprvq\" (UniqueName: \"kubernetes.io/projected/29ef2e82-35e4-4b31-9324-4fd4274a82b1-kube-api-access-pprvq\") pod \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\" (UID: \"29ef2e82-35e4-4b31-9324-4fd4274a82b1\") " Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.217508 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29ef2e82-35e4-4b31-9324-4fd4274a82b1-utilities" (OuterVolumeSpecName: "utilities") pod "29ef2e82-35e4-4b31-9324-4fd4274a82b1" (UID: "29ef2e82-35e4-4b31-9324-4fd4274a82b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.217990 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29ef2e82-35e4-4b31-9324-4fd4274a82b1-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.229035 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ef2e82-35e4-4b31-9324-4fd4274a82b1-kube-api-access-pprvq" (OuterVolumeSpecName: "kube-api-access-pprvq") pod "29ef2e82-35e4-4b31-9324-4fd4274a82b1" (UID: "29ef2e82-35e4-4b31-9324-4fd4274a82b1"). InnerVolumeSpecName "kube-api-access-pprvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.239891 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29ef2e82-35e4-4b31-9324-4fd4274a82b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29ef2e82-35e4-4b31-9324-4fd4274a82b1" (UID: "29ef2e82-35e4-4b31-9324-4fd4274a82b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.319274 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29ef2e82-35e4-4b31-9324-4fd4274a82b1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.319314 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pprvq\" (UniqueName: \"kubernetes.io/projected/29ef2e82-35e4-4b31-9324-4fd4274a82b1-kube-api-access-pprvq\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.447240 4585 generic.go:334] "Generic (PLEG): container finished" podID="29ef2e82-35e4-4b31-9324-4fd4274a82b1" containerID="2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f" exitCode=0 Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.447425 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2d6v" event={"ID":"29ef2e82-35e4-4b31-9324-4fd4274a82b1","Type":"ContainerDied","Data":"2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f"} Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.447487 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2d6v" event={"ID":"29ef2e82-35e4-4b31-9324-4fd4274a82b1","Type":"ContainerDied","Data":"ad7739df0666fc00220eddb370a2b21b8e84e63ce396c44abb118e0136b88730"} Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.447510 4585 scope.go:117] "RemoveContainer" containerID="2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.447676 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2d6v" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.448942 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvrx7" event={"ID":"3732bb19-81e6-42db-88c4-f0476ae5ace4","Type":"ContainerStarted","Data":"d87b53f578d87c87c612a27593d62c6f67a8402da180b02b73e231aaa1aaa874"} Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.472015 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2d6v"] Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.478802 4585 scope.go:117] "RemoveContainer" containerID="430b270ffeb7e50a2b070aad3755126a55b242deaa33c100d79a334dd36c77f6" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.482444 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2d6v"] Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.498443 4585 scope.go:117] "RemoveContainer" containerID="58a24cb030fbaa6dee22fe3aa48c5d1e5d0db0d3418d954c72346d90bff87464" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.531213 4585 scope.go:117] "RemoveContainer" containerID="2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f" Dec 01 14:14:10 crc kubenswrapper[4585]: E1201 14:14:10.531779 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f\": container with ID starting with 2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f not found: ID does not exist" containerID="2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.531822 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f"} err="failed to get container status \"2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f\": rpc error: code = NotFound desc = could not find container \"2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f\": container with ID starting with 2b604a606be77b06926321878fcdaf1cd226fca047d14b5274c7240d2558363f not found: ID does not exist" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.531850 4585 scope.go:117] "RemoveContainer" containerID="430b270ffeb7e50a2b070aad3755126a55b242deaa33c100d79a334dd36c77f6" Dec 01 14:14:10 crc kubenswrapper[4585]: E1201 14:14:10.532422 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430b270ffeb7e50a2b070aad3755126a55b242deaa33c100d79a334dd36c77f6\": container with ID starting with 430b270ffeb7e50a2b070aad3755126a55b242deaa33c100d79a334dd36c77f6 not found: ID does not exist" containerID="430b270ffeb7e50a2b070aad3755126a55b242deaa33c100d79a334dd36c77f6" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.532524 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430b270ffeb7e50a2b070aad3755126a55b242deaa33c100d79a334dd36c77f6"} err="failed to get container status \"430b270ffeb7e50a2b070aad3755126a55b242deaa33c100d79a334dd36c77f6\": rpc error: code = NotFound desc = could not find container \"430b270ffeb7e50a2b070aad3755126a55b242deaa33c100d79a334dd36c77f6\": container with ID starting with 430b270ffeb7e50a2b070aad3755126a55b242deaa33c100d79a334dd36c77f6 not found: ID does not 
exist" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.532604 4585 scope.go:117] "RemoveContainer" containerID="58a24cb030fbaa6dee22fe3aa48c5d1e5d0db0d3418d954c72346d90bff87464" Dec 01 14:14:10 crc kubenswrapper[4585]: E1201 14:14:10.533125 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58a24cb030fbaa6dee22fe3aa48c5d1e5d0db0d3418d954c72346d90bff87464\": container with ID starting with 58a24cb030fbaa6dee22fe3aa48c5d1e5d0db0d3418d954c72346d90bff87464 not found: ID does not exist" containerID="58a24cb030fbaa6dee22fe3aa48c5d1e5d0db0d3418d954c72346d90bff87464" Dec 01 14:14:10 crc kubenswrapper[4585]: I1201 14:14:10.533166 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a24cb030fbaa6dee22fe3aa48c5d1e5d0db0d3418d954c72346d90bff87464"} err="failed to get container status \"58a24cb030fbaa6dee22fe3aa48c5d1e5d0db0d3418d954c72346d90bff87464\": rpc error: code = NotFound desc = could not find container \"58a24cb030fbaa6dee22fe3aa48c5d1e5d0db0d3418d954c72346d90bff87464\": container with ID starting with 58a24cb030fbaa6dee22fe3aa48c5d1e5d0db0d3418d954c72346d90bff87464 not found: ID does not exist" Dec 01 14:14:11 crc kubenswrapper[4585]: I1201 14:14:11.460176 4585 generic.go:334] "Generic (PLEG): container finished" podID="26bcdef2-b1e8-4848-abc4-b1f6a45c9916" containerID="3a388456df96f333c161d96c402bb37c15ae10d548c894f4684d5a591c18c7d7" exitCode=0 Dec 01 14:14:11 crc kubenswrapper[4585]: I1201 14:14:11.460211 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7zn7j" event={"ID":"26bcdef2-b1e8-4848-abc4-b1f6a45c9916","Type":"ContainerDied","Data":"3a388456df96f333c161d96c402bb37c15ae10d548c894f4684d5a591c18c7d7"} Dec 01 14:14:11 crc kubenswrapper[4585]: I1201 14:14:11.510136 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:14:11 crc kubenswrapper[4585]: I1201 14:14:11.962116 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.017559 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-xqldv"] Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.424282 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29ef2e82-35e4-4b31-9324-4fd4274a82b1" path="/var/lib/kubelet/pods/29ef2e82-35e4-4b31-9324-4fd4274a82b1/volumes" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.467567 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" podUID="4e11bdcb-4b1d-466a-8c52-8dc536396e07" containerName="dnsmasq-dns" containerID="cri-o://674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed" gracePeriod=10 Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.846644 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.880391 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-scripts\") pod \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.880433 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-ring-data-devices\") pod \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.880504 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-dispersionconf\") pod \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.880595 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-swiftconf\") pod \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.880627 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-combined-ca-bundle\") pod \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.880652 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-etc-swift\") pod \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.880714 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m9qn\" (UniqueName: \"kubernetes.io/projected/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-kube-api-access-9m9qn\") pod \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\" (UID: \"26bcdef2-b1e8-4848-abc4-b1f6a45c9916\") " Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.884189 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "26bcdef2-b1e8-4848-abc4-b1f6a45c9916" (UID: "26bcdef2-b1e8-4848-abc4-b1f6a45c9916"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.885240 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "26bcdef2-b1e8-4848-abc4-b1f6a45c9916" (UID: "26bcdef2-b1e8-4848-abc4-b1f6a45c9916"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.895402 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-kube-api-access-9m9qn" (OuterVolumeSpecName: "kube-api-access-9m9qn") pod "26bcdef2-b1e8-4848-abc4-b1f6a45c9916" (UID: "26bcdef2-b1e8-4848-abc4-b1f6a45c9916"). InnerVolumeSpecName "kube-api-access-9m9qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.908640 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "26bcdef2-b1e8-4848-abc4-b1f6a45c9916" (UID: "26bcdef2-b1e8-4848-abc4-b1f6a45c9916"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.925158 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26bcdef2-b1e8-4848-abc4-b1f6a45c9916" (UID: "26bcdef2-b1e8-4848-abc4-b1f6a45c9916"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.928831 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-scripts" (OuterVolumeSpecName: "scripts") pod "26bcdef2-b1e8-4848-abc4-b1f6a45c9916" (UID: "26bcdef2-b1e8-4848-abc4-b1f6a45c9916"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.962923 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "26bcdef2-b1e8-4848-abc4-b1f6a45c9916" (UID: "26bcdef2-b1e8-4848-abc4-b1f6a45c9916"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.977988 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.982894 4585 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.982928 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.982939 4585 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.982948 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m9qn\" (UniqueName: \"kubernetes.io/projected/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-kube-api-access-9m9qn\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.982957 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.982965 4585 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:12 crc kubenswrapper[4585]: I1201 14:14:12.982991 4585 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26bcdef2-b1e8-4848-abc4-b1f6a45c9916-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.083679 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-config\") pod \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.083728 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-ovsdbserver-nb\") pod \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.083756 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-dns-svc\") pod \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.083806 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvfgb\" (UniqueName: \"kubernetes.io/projected/4e11bdcb-4b1d-466a-8c52-8dc536396e07-kube-api-access-fvfgb\") pod \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\" (UID: \"4e11bdcb-4b1d-466a-8c52-8dc536396e07\") " Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.088283 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4e11bdcb-4b1d-466a-8c52-8dc536396e07-kube-api-access-fvfgb" (OuterVolumeSpecName: "kube-api-access-fvfgb") pod "4e11bdcb-4b1d-466a-8c52-8dc536396e07" (UID: "4e11bdcb-4b1d-466a-8c52-8dc536396e07"). InnerVolumeSpecName "kube-api-access-fvfgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.124683 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-config" (OuterVolumeSpecName: "config") pod "4e11bdcb-4b1d-466a-8c52-8dc536396e07" (UID: "4e11bdcb-4b1d-466a-8c52-8dc536396e07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.127298 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e11bdcb-4b1d-466a-8c52-8dc536396e07" (UID: "4e11bdcb-4b1d-466a-8c52-8dc536396e07"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.132866 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e11bdcb-4b1d-466a-8c52-8dc536396e07" (UID: "4e11bdcb-4b1d-466a-8c52-8dc536396e07"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.185605 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.185633 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.185643 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvfgb\" (UniqueName: \"kubernetes.io/projected/4e11bdcb-4b1d-466a-8c52-8dc536396e07-kube-api-access-fvfgb\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.185655 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e11bdcb-4b1d-466a-8c52-8dc536396e07-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.480655 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7zn7j" event={"ID":"26bcdef2-b1e8-4848-abc4-b1f6a45c9916","Type":"ContainerDied","Data":"4e55ba08df100205d4936300028b810792534d1c4f92c5fe78d834f9eb0c4adb"} Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.480689 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e55ba08df100205d4936300028b810792534d1c4f92c5fe78d834f9eb0c4adb" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.480661 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7zn7j" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.482700 4585 generic.go:334] "Generic (PLEG): container finished" podID="4e11bdcb-4b1d-466a-8c52-8dc536396e07" containerID="674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed" exitCode=0 Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.482728 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" event={"ID":"4e11bdcb-4b1d-466a-8c52-8dc536396e07","Type":"ContainerDied","Data":"674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed"} Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.482744 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" event={"ID":"4e11bdcb-4b1d-466a-8c52-8dc536396e07","Type":"ContainerDied","Data":"557eac9d5a138a997fd520b8570c2d8a00678d23fba5f4b80336251db22ddc45"} Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.482761 4585 scope.go:117] "RemoveContainer" containerID="674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.482766 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-xqldv" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.537164 4585 scope.go:117] "RemoveContainer" containerID="beb7a5c661ba2bb8184efe457308ffaa56ab172edf4a3bc6227bfca50a6c524f" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.545848 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-xqldv"] Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.549171 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-xqldv"] Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.587288 4585 scope.go:117] "RemoveContainer" containerID="674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed" Dec 01 14:14:13 crc kubenswrapper[4585]: E1201 14:14:13.587963 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed\": container with ID starting with 674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed not found: ID does not exist" containerID="674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.588021 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed"} err="failed to get container status \"674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed\": rpc error: code = NotFound desc = could not find container \"674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed\": container with ID starting with 674e6b706f16b7ded0bde538df57be7f7c4ca70a58ed7d59ec9903c1faf47bed not found: ID does not exist" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.588051 4585 scope.go:117] "RemoveContainer" containerID="beb7a5c661ba2bb8184efe457308ffaa56ab172edf4a3bc6227bfca50a6c524f" Dec 01 14:14:13 crc kubenswrapper[4585]: E1201 14:14:13.588282 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb7a5c661ba2bb8184efe457308ffaa56ab172edf4a3bc6227bfca50a6c524f\": container with ID starting with 
beb7a5c661ba2bb8184efe457308ffaa56ab172edf4a3bc6227bfca50a6c524f not found: ID does not exist" containerID="beb7a5c661ba2bb8184efe457308ffaa56ab172edf4a3bc6227bfca50a6c524f" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.588308 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb7a5c661ba2bb8184efe457308ffaa56ab172edf4a3bc6227bfca50a6c524f"} err="failed to get container status \"beb7a5c661ba2bb8184efe457308ffaa56ab172edf4a3bc6227bfca50a6c524f\": rpc error: code = NotFound desc = could not find container \"beb7a5c661ba2bb8184efe457308ffaa56ab172edf4a3bc6227bfca50a6c524f\": container with ID starting with beb7a5c661ba2bb8184efe457308ffaa56ab172edf4a3bc6227bfca50a6c524f not found: ID does not exist" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.716694 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.716751 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.838491 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ncdb4"] Dec 01 14:14:13 crc kubenswrapper[4585]: I1201 14:14:13.838775 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ncdb4" podUID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerName="registry-server" containerID="cri-o://ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252" gracePeriod=2 Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.301710 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.405590 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-utilities\") pod \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\" (UID: \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\") " Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.405773 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-catalog-content\") pod \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\" (UID: \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\") " Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.405805 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttnbx\" (UniqueName: \"kubernetes.io/projected/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-kube-api-access-ttnbx\") pod \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\" (UID: \"cca7a4db-9ec1-4a3c-8458-ab724cdd6861\") " Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.409953 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-utilities" (OuterVolumeSpecName: "utilities") pod "cca7a4db-9ec1-4a3c-8458-ab724cdd6861" (UID: "cca7a4db-9ec1-4a3c-8458-ab724cdd6861"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.412087 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-kube-api-access-ttnbx" (OuterVolumeSpecName: "kube-api-access-ttnbx") pod "cca7a4db-9ec1-4a3c-8458-ab724cdd6861" (UID: "cca7a4db-9ec1-4a3c-8458-ab724cdd6861"). InnerVolumeSpecName "kube-api-access-ttnbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.440218 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e11bdcb-4b1d-466a-8c52-8dc536396e07" path="/var/lib/kubelet/pods/4e11bdcb-4b1d-466a-8c52-8dc536396e07/volumes" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.502450 4585 generic.go:334] "Generic (PLEG): container finished" podID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerID="ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252" exitCode=0 Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.502526 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncdb4" event={"ID":"cca7a4db-9ec1-4a3c-8458-ab724cdd6861","Type":"ContainerDied","Data":"ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252"} Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.502560 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ncdb4" event={"ID":"cca7a4db-9ec1-4a3c-8458-ab724cdd6861","Type":"ContainerDied","Data":"cbd1c68a0082301f4ac70db9ba3bdc8168461564e0e5b637f025c4bc634414d3"} Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.502574 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ncdb4" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.502587 4585 scope.go:117] "RemoveContainer" containerID="ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.507942 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttnbx\" (UniqueName: \"kubernetes.io/projected/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-kube-api-access-ttnbx\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.507984 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.520083 4585 scope.go:117] "RemoveContainer" containerID="8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.537861 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cca7a4db-9ec1-4a3c-8458-ab724cdd6861" (UID: "cca7a4db-9ec1-4a3c-8458-ab724cdd6861"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.541260 4585 scope.go:117] "RemoveContainer" containerID="f4af9d3d5df9bb817d43a4fcf227b346ab6f58b2e386bff42298a911d6612d3a" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.558955 4585 scope.go:117] "RemoveContainer" containerID="ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252" Dec 01 14:14:14 crc kubenswrapper[4585]: E1201 14:14:14.559358 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252\": container with ID starting with ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252 not found: ID does not exist" containerID="ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.559395 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252"} err="failed to get container status \"ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252\": rpc error: code = NotFound desc = could not find container \"ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252\": container with ID starting with ac9841e053a337a64b3d1b41546d631fd4d1dcb25d96d60d6660f931e228d252 not found: ID does not exist" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.559440 4585 scope.go:117] "RemoveContainer" containerID="8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53" Dec 01 14:14:14 crc kubenswrapper[4585]: E1201 14:14:14.559833 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53\": container with ID starting with 8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53 not found: ID does not exist" containerID="8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.559874 4585 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53"} err="failed to get container status \"8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53\": rpc error: code = NotFound desc = could not find container \"8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53\": container with ID starting with 8c8f1bebbe3ceaba9904177a45cdce226b641556b1de7e9272b34b37fde27c53 not found: ID does not exist" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.559888 4585 scope.go:117] "RemoveContainer" containerID="f4af9d3d5df9bb817d43a4fcf227b346ab6f58b2e386bff42298a911d6612d3a" Dec 01 14:14:14 crc kubenswrapper[4585]: E1201 14:14:14.560322 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4af9d3d5df9bb817d43a4fcf227b346ab6f58b2e386bff42298a911d6612d3a\": container with ID starting with f4af9d3d5df9bb817d43a4fcf227b346ab6f58b2e386bff42298a911d6612d3a not found: ID does not exist" containerID="f4af9d3d5df9bb817d43a4fcf227b346ab6f58b2e386bff42298a911d6612d3a" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.560343 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4af9d3d5df9bb817d43a4fcf227b346ab6f58b2e386bff42298a911d6612d3a"} err="failed to get container status \"f4af9d3d5df9bb817d43a4fcf227b346ab6f58b2e386bff42298a911d6612d3a\": rpc error: code = NotFound desc = could not find container \"f4af9d3d5df9bb817d43a4fcf227b346ab6f58b2e386bff42298a911d6612d3a\": container with ID starting with f4af9d3d5df9bb817d43a4fcf227b346ab6f58b2e386bff42298a911d6612d3a not found: ID does not exist" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.609216 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca7a4db-9ec1-4a3c-8458-ab724cdd6861-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.720703 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m7xvm" podUID="2f3d9474-e60e-401e-8597-1bd7af4f34c3" containerName="ovn-controller" probeResult="failure" output=< Dec 01 14:14:14 crc kubenswrapper[4585]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 01 14:14:14 crc kubenswrapper[4585]: > Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.886350 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ncdb4"] Dec 01 14:14:14 crc kubenswrapper[4585]: I1201 14:14:14.890405 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ncdb4"] Dec 01 14:14:15 crc kubenswrapper[4585]: I1201 14:14:15.514423 4585 generic.go:334] "Generic (PLEG): container finished" podID="6c266121-e7d2-42aa-b1d9-0d15bdd0f798" containerID="01471bb78ac8279a59d5f59a8cd08029ea754a6ffbcf48823c084903b339191c" exitCode=0 Dec 01 14:14:15 crc kubenswrapper[4585]: I1201 14:14:15.514494 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c266121-e7d2-42aa-b1d9-0d15bdd0f798","Type":"ContainerDied","Data":"01471bb78ac8279a59d5f59a8cd08029ea754a6ffbcf48823c084903b339191c"} Dec 01 14:14:16 crc kubenswrapper[4585]: I1201 14:14:16.425608 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" 
path="/var/lib/kubelet/pods/cca7a4db-9ec1-4a3c-8458-ab724cdd6861/volumes" Dec 01 14:14:16 crc kubenswrapper[4585]: I1201 14:14:16.529103 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c266121-e7d2-42aa-b1d9-0d15bdd0f798","Type":"ContainerStarted","Data":"750cf3415d2c964a31c9ffec8a7a334d06b9a8c65269a22a794fe0a3cf8946c2"} Dec 01 14:14:16 crc kubenswrapper[4585]: I1201 14:14:16.529321 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:14:16 crc kubenswrapper[4585]: I1201 14:14:16.534260 4585 generic.go:334] "Generic (PLEG): container finished" podID="d41c9a27-f15b-44c5-84b2-0e083f8dc837" containerID="136bde2b6e8e606ac594a703cd22914247375724ce60c2580695ad1f22d011e9" exitCode=0 Dec 01 14:14:16 crc kubenswrapper[4585]: I1201 14:14:16.534305 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d41c9a27-f15b-44c5-84b2-0e083f8dc837","Type":"ContainerDied","Data":"136bde2b6e8e606ac594a703cd22914247375724ce60c2580695ad1f22d011e9"} Dec 01 14:14:16 crc kubenswrapper[4585]: I1201 14:14:16.560597 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.491905251 podStartE2EDuration="1m13.56057218s" podCreationTimestamp="2025-12-01 14:13:03 +0000 UTC" firstStartedPulling="2025-12-01 14:13:05.957201396 +0000 UTC m=+899.941415251" lastFinishedPulling="2025-12-01 14:13:39.025868325 +0000 UTC m=+933.010082180" observedRunningTime="2025-12-01 14:14:16.554466068 +0000 UTC m=+970.538679923" watchObservedRunningTime="2025-12-01 14:14:16.56057218 +0000 UTC m=+970.544786025" Dec 01 14:14:17 crc kubenswrapper[4585]: I1201 14:14:17.505671 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 01 14:14:19 crc kubenswrapper[4585]: I1201 14:14:19.273065 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:14:19 crc kubenswrapper[4585]: I1201 14:14:19.320985 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvjgq"] Dec 01 14:14:19 crc kubenswrapper[4585]: I1201 14:14:19.563066 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hvjgq" podUID="ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" containerName="registry-server" containerID="cri-o://7519df874c2b286bffcd27bcc6863ca34d10076fd854e48742dfa6960dfddbba" gracePeriod=2 Dec 01 14:14:19 crc kubenswrapper[4585]: I1201 14:14:19.726849 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m7xvm" podUID="2f3d9474-e60e-401e-8597-1bd7af4f34c3" containerName="ovn-controller" probeResult="failure" output=< Dec 01 14:14:19 crc kubenswrapper[4585]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 01 14:14:19 crc kubenswrapper[4585]: > Dec 01 14:14:19 crc kubenswrapper[4585]: I1201 14:14:19.837509 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:14:20 crc kubenswrapper[4585]: I1201 14:14:20.574602 4585 generic.go:334] "Generic (PLEG): container finished" podID="ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" containerID="7519df874c2b286bffcd27bcc6863ca34d10076fd854e48742dfa6960dfddbba" exitCode=0 Dec 01 14:14:20 crc kubenswrapper[4585]: I1201 
14:14:20.574662 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvjgq" event={"ID":"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195","Type":"ContainerDied","Data":"7519df874c2b286bffcd27bcc6863ca34d10076fd854e48742dfa6960dfddbba"} Dec 01 14:14:23 crc kubenswrapper[4585]: I1201 14:14:23.586049 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:14:23 crc kubenswrapper[4585]: I1201 14:14:23.595264 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc5c97f-1882-47da-843c-f8dba234f1f3-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc5c97f-1882-47da-843c-f8dba234f1f3\") " pod="openstack/swift-storage-0" Dec 01 14:14:23 crc kubenswrapper[4585]: I1201 14:14:23.829530 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.284574 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.409716 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-catalog-content\") pod \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\" (UID: \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\") " Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.409759 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhhqg\" (UniqueName: \"kubernetes.io/projected/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-kube-api-access-zhhqg\") pod \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\" (UID: \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\") " Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.409891 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-utilities\") pod \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\" (UID: \"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195\") " Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.411054 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-utilities" (OuterVolumeSpecName: "utilities") pod "ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" (UID: "ac4aab85-ddfc-4e11-8e25-54e8bbdbe195"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.414491 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-kube-api-access-zhhqg" (OuterVolumeSpecName: "kube-api-access-zhhqg") pod "ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" (UID: "ac4aab85-ddfc-4e11-8e25-54e8bbdbe195"). InnerVolumeSpecName "kube-api-access-zhhqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.479561 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" (UID: "ac4aab85-ddfc-4e11-8e25-54e8bbdbe195"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.512277 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.512307 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.512321 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhhqg\" (UniqueName: \"kubernetes.io/projected/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195-kube-api-access-zhhqg\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.619156 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.627868 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvjgq" event={"ID":"ac4aab85-ddfc-4e11-8e25-54e8bbdbe195","Type":"ContainerDied","Data":"0197763a998dc6a7277d9d0e5384b706bd7741eb224983bdea15dd3f6d318a6f"} Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.627904 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvjgq" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.627926 4585 scope.go:117] "RemoveContainer" containerID="7519df874c2b286bffcd27bcc6863ca34d10076fd854e48742dfa6960dfddbba" Dec 01 14:14:24 crc kubenswrapper[4585]: W1201 14:14:24.636583 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bc5c97f_1882_47da_843c_f8dba234f1f3.slice/crio-aee2ae8fc40caa6eff7cdc3b7060ae711e247ca3551ff429f8480370e35f9631 WatchSource:0}: Error finding container aee2ae8fc40caa6eff7cdc3b7060ae711e247ca3551ff429f8480370e35f9631: Status 404 returned error can't find the container with id aee2ae8fc40caa6eff7cdc3b7060ae711e247ca3551ff429f8480370e35f9631 Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.637652 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d41c9a27-f15b-44c5-84b2-0e083f8dc837","Type":"ContainerStarted","Data":"85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9"} Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.637860 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.700605 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371955.154196 podStartE2EDuration="1m21.70057959s" podCreationTimestamp="2025-12-01 14:13:03 +0000 UTC" firstStartedPulling="2025-12-01 14:13:06.000089999 +0000 UTC m=+899.984303864" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:24.679483718 +0000 UTC m=+978.663697573" watchObservedRunningTime="2025-12-01 14:14:24.70057959 +0000 UTC m=+978.684793445" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.703889 4585 scope.go:117] "RemoveContainer" containerID="3a24e83e1e6abbfb923e84dfbd86da400f2a1ffad61c0294af0c1d0627c43acc" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.704055 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvjgq"] Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.719535 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hvjgq"] Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.736079 4585 scope.go:117] "RemoveContainer" containerID="7b098606d3cddb0cc3e7f3ef61839954f1b3e2cfde00bf4575f77c270ab3a030" Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.747171 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-m7xvm" podUID="2f3d9474-e60e-401e-8597-1bd7af4f34c3" containerName="ovn-controller" probeResult="failure" output=< Dec 01 14:14:24 crc kubenswrapper[4585]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 01 14:14:24 crc kubenswrapper[4585]: > Dec 01 14:14:24 crc kubenswrapper[4585]: I1201 14:14:24.851144 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rt4xq" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.070719 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-m7xvm-config-49rfz"] Dec 01 14:14:25 crc kubenswrapper[4585]: E1201 14:14:25.071385 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e11bdcb-4b1d-466a-8c52-8dc536396e07" containerName="init" Dec 01 
14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071405 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e11bdcb-4b1d-466a-8c52-8dc536396e07" containerName="init" Dec 01 14:14:25 crc kubenswrapper[4585]: E1201 14:14:25.071421 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ef2e82-35e4-4b31-9324-4fd4274a82b1" containerName="extract-content" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071428 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ef2e82-35e4-4b31-9324-4fd4274a82b1" containerName="extract-content" Dec 01 14:14:25 crc kubenswrapper[4585]: E1201 14:14:25.071435 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26bcdef2-b1e8-4848-abc4-b1f6a45c9916" containerName="swift-ring-rebalance" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071442 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="26bcdef2-b1e8-4848-abc4-b1f6a45c9916" containerName="swift-ring-rebalance" Dec 01 14:14:25 crc kubenswrapper[4585]: E1201 14:14:25.071459 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerName="registry-server" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071465 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerName="registry-server" Dec 01 14:14:25 crc kubenswrapper[4585]: E1201 14:14:25.071474 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" containerName="extract-content" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071480 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" containerName="extract-content" Dec 01 14:14:25 crc kubenswrapper[4585]: E1201 14:14:25.071487 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerName="extract-utilities" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071493 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerName="extract-utilities" Dec 01 14:14:25 crc kubenswrapper[4585]: E1201 14:14:25.071503 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerName="extract-content" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071509 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerName="extract-content" Dec 01 14:14:25 crc kubenswrapper[4585]: E1201 14:14:25.071521 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" containerName="extract-utilities" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071528 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" containerName="extract-utilities" Dec 01 14:14:25 crc kubenswrapper[4585]: E1201 14:14:25.071534 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ef2e82-35e4-4b31-9324-4fd4274a82b1" containerName="extract-utilities" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071539 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ef2e82-35e4-4b31-9324-4fd4274a82b1" containerName="extract-utilities" Dec 01 14:14:25 crc kubenswrapper[4585]: E1201 14:14:25.071546 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ef2e82-35e4-4b31-9324-4fd4274a82b1" 
containerName="registry-server" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071552 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ef2e82-35e4-4b31-9324-4fd4274a82b1" containerName="registry-server" Dec 01 14:14:25 crc kubenswrapper[4585]: E1201 14:14:25.071563 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" containerName="registry-server" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071569 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" containerName="registry-server" Dec 01 14:14:25 crc kubenswrapper[4585]: E1201 14:14:25.071579 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e11bdcb-4b1d-466a-8c52-8dc536396e07" containerName="dnsmasq-dns" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071585 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e11bdcb-4b1d-466a-8c52-8dc536396e07" containerName="dnsmasq-dns" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071726 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" containerName="registry-server" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071742 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ef2e82-35e4-4b31-9324-4fd4274a82b1" containerName="registry-server" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071752 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="26bcdef2-b1e8-4848-abc4-b1f6a45c9916" containerName="swift-ring-rebalance" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071763 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca7a4db-9ec1-4a3c-8458-ab724cdd6861" containerName="registry-server" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.071773 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e11bdcb-4b1d-466a-8c52-8dc536396e07" containerName="dnsmasq-dns" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.072290 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.087396 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.094340 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m7xvm-config-49rfz"] Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.121867 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-log-ovn\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.121954 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-run-ovn\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.122062 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/af4ff2e6-05e0-4841-aed8-f330da5095e9-additional-scripts\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.122086 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-run\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.122107 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2vxm\" (UniqueName: \"kubernetes.io/projected/af4ff2e6-05e0-4841-aed8-f330da5095e9-kube-api-access-g2vxm\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.122148 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af4ff2e6-05e0-4841-aed8-f330da5095e9-scripts\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.223913 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/af4ff2e6-05e0-4841-aed8-f330da5095e9-additional-scripts\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.223984 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-run\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.224001 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2vxm\" (UniqueName: \"kubernetes.io/projected/af4ff2e6-05e0-4841-aed8-f330da5095e9-kube-api-access-g2vxm\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.224039 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af4ff2e6-05e0-4841-aed8-f330da5095e9-scripts\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.224100 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-log-ovn\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.224117 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-run-ovn\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.224395 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-run-ovn\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.224463 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-run\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.224722 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/af4ff2e6-05e0-4841-aed8-f330da5095e9-additional-scripts\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.224815 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-log-ovn\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.226435 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/af4ff2e6-05e0-4841-aed8-f330da5095e9-scripts\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.255232 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2vxm\" (UniqueName: \"kubernetes.io/projected/af4ff2e6-05e0-4841-aed8-f330da5095e9-kube-api-access-g2vxm\") pod \"ovn-controller-m7xvm-config-49rfz\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.386511 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.657872 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"aee2ae8fc40caa6eff7cdc3b7060ae711e247ca3551ff429f8480370e35f9631"} Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.663181 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvrx7" event={"ID":"3732bb19-81e6-42db-88c4-f0476ae5ace4","Type":"ContainerStarted","Data":"fe9aa6d17ef960a658e666c9d866827cb0d099f697364ce2f6cc4b9686014cb1"} Dec 01 14:14:25 crc kubenswrapper[4585]: I1201 14:14:25.683602 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-rvrx7" podStartSLOduration=2.57068133 podStartE2EDuration="16.683570173s" podCreationTimestamp="2025-12-01 14:14:09 +0000 UTC" firstStartedPulling="2025-12-01 14:14:10.028800919 +0000 UTC m=+964.013014774" lastFinishedPulling="2025-12-01 14:14:24.141689762 +0000 UTC m=+978.125903617" observedRunningTime="2025-12-01 14:14:25.682540186 +0000 UTC m=+979.666754041" watchObservedRunningTime="2025-12-01 14:14:25.683570173 +0000 UTC m=+979.667784028" Dec 01 14:14:26 crc kubenswrapper[4585]: I1201 14:14:26.414706 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-m7xvm-config-49rfz"] Dec 01 14:14:26 crc kubenswrapper[4585]: I1201 14:14:26.439220 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac4aab85-ddfc-4e11-8e25-54e8bbdbe195" path="/var/lib/kubelet/pods/ac4aab85-ddfc-4e11-8e25-54e8bbdbe195/volumes" Dec 01 14:14:26 crc kubenswrapper[4585]: I1201 14:14:26.675423 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m7xvm-config-49rfz" event={"ID":"af4ff2e6-05e0-4841-aed8-f330da5095e9","Type":"ContainerStarted","Data":"0c0d48e44e33d820dc7aad3cbf67ea943e32991586f71c7c4d7b3972f1935d58"} Dec 01 14:14:27 crc kubenswrapper[4585]: I1201 14:14:27.683587 4585 generic.go:334] "Generic (PLEG): container finished" podID="af4ff2e6-05e0-4841-aed8-f330da5095e9" containerID="6d30ffa5587b83f16bca0f1a93349dff61fb549a0681bd8110eb45017a5f9200" exitCode=0 Dec 01 14:14:27 crc kubenswrapper[4585]: I1201 14:14:27.683660 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m7xvm-config-49rfz" event={"ID":"af4ff2e6-05e0-4841-aed8-f330da5095e9","Type":"ContainerDied","Data":"6d30ffa5587b83f16bca0f1a93349dff61fb549a0681bd8110eb45017a5f9200"} Dec 01 14:14:27 crc kubenswrapper[4585]: I1201 14:14:27.685759 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"36988353ac935f3b467f1d956772fbc363097474daeec228847fe7be7ab8909d"} Dec 01 14:14:27 crc kubenswrapper[4585]: I1201 14:14:27.685795 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"a98b652822696a0da6ebe2b82f8a20703e4363265383e04808831bea4a2b8c6a"} Dec 01 14:14:28 crc kubenswrapper[4585]: I1201 14:14:28.698344 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"9fca54743961cfa8be6c140f641dac0b94d98cb9edd9d14d936291a660d72585"} Dec 01 14:14:28 crc kubenswrapper[4585]: I1201 14:14:28.698836 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"3a655646a899ef0000facafd5dcf756cf36fda5ddf6185b098acd285862f8780"} Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.038240 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.100304 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2vxm\" (UniqueName: \"kubernetes.io/projected/af4ff2e6-05e0-4841-aed8-f330da5095e9-kube-api-access-g2vxm\") pod \"af4ff2e6-05e0-4841-aed8-f330da5095e9\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.100359 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af4ff2e6-05e0-4841-aed8-f330da5095e9-scripts\") pod \"af4ff2e6-05e0-4841-aed8-f330da5095e9\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.100425 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-run\") pod \"af4ff2e6-05e0-4841-aed8-f330da5095e9\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.100517 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/af4ff2e6-05e0-4841-aed8-f330da5095e9-additional-scripts\") pod \"af4ff2e6-05e0-4841-aed8-f330da5095e9\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.100562 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-run-ovn\") pod \"af4ff2e6-05e0-4841-aed8-f330da5095e9\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.100616 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-log-ovn\") pod \"af4ff2e6-05e0-4841-aed8-f330da5095e9\" (UID: \"af4ff2e6-05e0-4841-aed8-f330da5095e9\") " Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.100636 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-run" (OuterVolumeSpecName: "var-run") pod "af4ff2e6-05e0-4841-aed8-f330da5095e9" (UID: "af4ff2e6-05e0-4841-aed8-f330da5095e9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.100702 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "af4ff2e6-05e0-4841-aed8-f330da5095e9" (UID: "af4ff2e6-05e0-4841-aed8-f330da5095e9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.100805 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "af4ff2e6-05e0-4841-aed8-f330da5095e9" (UID: "af4ff2e6-05e0-4841-aed8-f330da5095e9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.101261 4585 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.101276 4585 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.101286 4585 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/af4ff2e6-05e0-4841-aed8-f330da5095e9-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.101276 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af4ff2e6-05e0-4841-aed8-f330da5095e9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "af4ff2e6-05e0-4841-aed8-f330da5095e9" (UID: "af4ff2e6-05e0-4841-aed8-f330da5095e9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.101496 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af4ff2e6-05e0-4841-aed8-f330da5095e9-scripts" (OuterVolumeSpecName: "scripts") pod "af4ff2e6-05e0-4841-aed8-f330da5095e9" (UID: "af4ff2e6-05e0-4841-aed8-f330da5095e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.113363 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af4ff2e6-05e0-4841-aed8-f330da5095e9-kube-api-access-g2vxm" (OuterVolumeSpecName: "kube-api-access-g2vxm") pod "af4ff2e6-05e0-4841-aed8-f330da5095e9" (UID: "af4ff2e6-05e0-4841-aed8-f330da5095e9"). InnerVolumeSpecName "kube-api-access-g2vxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.203013 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2vxm\" (UniqueName: \"kubernetes.io/projected/af4ff2e6-05e0-4841-aed8-f330da5095e9-kube-api-access-g2vxm\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.203044 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af4ff2e6-05e0-4841-aed8-f330da5095e9-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.203054 4585 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/af4ff2e6-05e0-4841-aed8-f330da5095e9-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.707803 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-m7xvm-config-49rfz" event={"ID":"af4ff2e6-05e0-4841-aed8-f330da5095e9","Type":"ContainerDied","Data":"0c0d48e44e33d820dc7aad3cbf67ea943e32991586f71c7c4d7b3972f1935d58"} Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.708106 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c0d48e44e33d820dc7aad3cbf67ea943e32991586f71c7c4d7b3972f1935d58" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.708041 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-m7xvm-config-49rfz" Dec 01 14:14:29 crc kubenswrapper[4585]: I1201 14:14:29.744712 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-m7xvm" Dec 01 14:14:30 crc kubenswrapper[4585]: I1201 14:14:30.156196 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-m7xvm-config-49rfz"] Dec 01 14:14:30 crc kubenswrapper[4585]: I1201 14:14:30.163197 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-m7xvm-config-49rfz"] Dec 01 14:14:30 crc kubenswrapper[4585]: I1201 14:14:30.422613 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af4ff2e6-05e0-4841-aed8-f330da5095e9" path="/var/lib/kubelet/pods/af4ff2e6-05e0-4841-aed8-f330da5095e9/volumes" Dec 01 14:14:30 crc kubenswrapper[4585]: I1201 14:14:30.719808 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"50834b8d2d40067d6fc5f1b2707d4fe66f76e9bd458eb1792b10f0599bd60fca"} Dec 01 14:14:30 crc kubenswrapper[4585]: I1201 14:14:30.719848 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"9115e387a734ec09e17237d3649c2579d56a9f00b8855c3c6aad206b6fccad90"} Dec 01 14:14:30 crc kubenswrapper[4585]: I1201 14:14:30.719862 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"32fed8fbfe372b437cba0300e0067fa1274e0d398a0f99caaeb23b40b293b73d"} Dec 01 14:14:30 crc kubenswrapper[4585]: I1201 14:14:30.719871 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"56d2d4d71ca9d0a911ba092c560846e9edce9d997e0422d70d73b1641d6add20"} Dec 01 14:14:32 crc kubenswrapper[4585]: I1201 14:14:32.748634 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"ca80cd3624ca6bc30a4cfa54ac0dbcd47e64f4d63b8db797809d1c242dc8b605"} Dec 01 14:14:32 crc kubenswrapper[4585]: I1201 14:14:32.749094 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"70f074d05ccae281db36b124c0dbdd190be44a7aa064abe76c9988e12ca8c5d0"} Dec 01 14:14:32 crc kubenswrapper[4585]: I1201 14:14:32.749111 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"19d17a1605a5295bc67abfbbbae6ab4befebc1bef93daa62704c55caaeb813eb"} Dec 01 14:14:32 crc kubenswrapper[4585]: I1201 14:14:32.749119 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"1c1fbdd60c5d55846a7661b4f7c30113179ee94433ecd415681efeebcab02f3a"} Dec 01 14:14:32 crc kubenswrapper[4585]: I1201 14:14:32.749129 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"3d6f29fc400fdfef62664af58cfce67d83fd79a0f7d7a94a99ab97fa51489ba9"} Dec 01 14:14:32 crc kubenswrapper[4585]: I1201 14:14:32.749137 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"18c4218d3d825df2a87839a12cc0383ba929d89c3df0d9d200937d37369940c7"} Dec 01 14:14:33 crc kubenswrapper[4585]: I1201 14:14:33.767546 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc5c97f-1882-47da-843c-f8dba234f1f3","Type":"ContainerStarted","Data":"73a19cbf3df9ebba82ec723339895ef232d19e56e5d2e580e8024dd0eccbcc32"} Dec 01 14:14:33 crc kubenswrapper[4585]: I1201 14:14:33.770041 4585 generic.go:334] "Generic (PLEG): container finished" podID="3732bb19-81e6-42db-88c4-f0476ae5ace4" containerID="fe9aa6d17ef960a658e666c9d866827cb0d099f697364ce2f6cc4b9686014cb1" exitCode=0 Dec 01 14:14:33 crc kubenswrapper[4585]: I1201 14:14:33.770092 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvrx7" event={"ID":"3732bb19-81e6-42db-88c4-f0476ae5ace4","Type":"ContainerDied","Data":"fe9aa6d17ef960a658e666c9d866827cb0d099f697364ce2f6cc4b9686014cb1"} Dec 01 14:14:33 crc kubenswrapper[4585]: I1201 14:14:33.832894 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.715961242 podStartE2EDuration="43.832874351s" podCreationTimestamp="2025-12-01 14:13:50 +0000 UTC" firstStartedPulling="2025-12-01 14:14:24.642771719 +0000 UTC m=+978.626985574" lastFinishedPulling="2025-12-01 14:14:31.759684828 +0000 UTC m=+985.743898683" observedRunningTime="2025-12-01 14:14:33.831827173 +0000 UTC m=+987.816041088" watchObservedRunningTime="2025-12-01 14:14:33.832874351 +0000 UTC m=+987.817088206" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.229067 4585 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-764c5664d7-clpkz"] Dec 01 14:14:34 crc kubenswrapper[4585]: E1201 14:14:34.229386 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4ff2e6-05e0-4841-aed8-f330da5095e9" containerName="ovn-config" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.229401 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4ff2e6-05e0-4841-aed8-f330da5095e9" containerName="ovn-config" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.229555 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="af4ff2e6-05e0-4841-aed8-f330da5095e9" containerName="ovn-config" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.230439 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.233959 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.249114 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-clpkz"] Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.280769 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.280837 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-dns-svc\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.280951 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pwh6\" (UniqueName: \"kubernetes.io/projected/8221747f-ffad-4b2e-bbd9-94b7d25949cc-kube-api-access-7pwh6\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.281010 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-config\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.281049 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.281149 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " 
pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.382950 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.383018 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-dns-svc\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.383066 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pwh6\" (UniqueName: \"kubernetes.io/projected/8221747f-ffad-4b2e-bbd9-94b7d25949cc-kube-api-access-7pwh6\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.383102 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-config\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.383137 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.383251 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.383966 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-dns-svc\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.384183 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.384183 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-config\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 
14:14:34.384250 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.384689 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.401493 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pwh6\" (UniqueName: \"kubernetes.io/projected/8221747f-ffad-4b2e-bbd9-94b7d25949cc-kube-api-access-7pwh6\") pod \"dnsmasq-dns-764c5664d7-clpkz\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.547165 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:34 crc kubenswrapper[4585]: I1201 14:14:34.992208 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-clpkz"] Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.123773 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.173248 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.188363 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.298673 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-config-data\") pod \"3732bb19-81e6-42db-88c4-f0476ae5ace4\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.299172 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lrq8\" (UniqueName: \"kubernetes.io/projected/3732bb19-81e6-42db-88c4-f0476ae5ace4-kube-api-access-4lrq8\") pod \"3732bb19-81e6-42db-88c4-f0476ae5ace4\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.299230 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-combined-ca-bundle\") pod \"3732bb19-81e6-42db-88c4-f0476ae5ace4\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.299276 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-db-sync-config-data\") pod \"3732bb19-81e6-42db-88c4-f0476ae5ace4\" (UID: \"3732bb19-81e6-42db-88c4-f0476ae5ace4\") " Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.308269 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/3732bb19-81e6-42db-88c4-f0476ae5ace4-kube-api-access-4lrq8" (OuterVolumeSpecName: "kube-api-access-4lrq8") pod "3732bb19-81e6-42db-88c4-f0476ae5ace4" (UID: "3732bb19-81e6-42db-88c4-f0476ae5ace4"). InnerVolumeSpecName "kube-api-access-4lrq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.312465 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3732bb19-81e6-42db-88c4-f0476ae5ace4" (UID: "3732bb19-81e6-42db-88c4-f0476ae5ace4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.354290 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3732bb19-81e6-42db-88c4-f0476ae5ace4" (UID: "3732bb19-81e6-42db-88c4-f0476ae5ace4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.395343 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-config-data" (OuterVolumeSpecName: "config-data") pod "3732bb19-81e6-42db-88c4-f0476ae5ace4" (UID: "3732bb19-81e6-42db-88c4-f0476ae5ace4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.401854 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lrq8\" (UniqueName: \"kubernetes.io/projected/3732bb19-81e6-42db-88c4-f0476ae5ace4-kube-api-access-4lrq8\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.401893 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.401904 4585 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.401912 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3732bb19-81e6-42db-88c4-f0476ae5ace4-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.792922 4585 generic.go:334] "Generic (PLEG): container finished" podID="8221747f-ffad-4b2e-bbd9-94b7d25949cc" containerID="9af3e001e6713a8464ffda57096c95b11f6f20a92b9ad1e2f7b2ef8cda667999" exitCode=0 Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.792996 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-clpkz" event={"ID":"8221747f-ffad-4b2e-bbd9-94b7d25949cc","Type":"ContainerDied","Data":"9af3e001e6713a8464ffda57096c95b11f6f20a92b9ad1e2f7b2ef8cda667999"} Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.793047 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-clpkz" 
event={"ID":"8221747f-ffad-4b2e-bbd9-94b7d25949cc","Type":"ContainerStarted","Data":"0ef5c3ea209683a1ac074d0314872a8cc32447351a782476f2611d224a0a65db"} Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.795017 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rvrx7" event={"ID":"3732bb19-81e6-42db-88c4-f0476ae5ace4","Type":"ContainerDied","Data":"d87b53f578d87c87c612a27593d62c6f67a8402da180b02b73e231aaa1aaa874"} Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.795048 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d87b53f578d87c87c612a27593d62c6f67a8402da180b02b73e231aaa1aaa874" Dec 01 14:14:35 crc kubenswrapper[4585]: I1201 14:14:35.795185 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rvrx7" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.317960 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-clpkz"] Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.363639 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-5hcms"] Dec 01 14:14:36 crc kubenswrapper[4585]: E1201 14:14:36.364301 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3732bb19-81e6-42db-88c4-f0476ae5ace4" containerName="glance-db-sync" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.364386 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="3732bb19-81e6-42db-88c4-f0476ae5ace4" containerName="glance-db-sync" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.364608 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="3732bb19-81e6-42db-88c4-f0476ae5ace4" containerName="glance-db-sync" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.365535 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.445440 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-5hcms"] Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.522117 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-config\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.522175 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzg4f\" (UniqueName: \"kubernetes.io/projected/36a1f1e1-f620-422c-b66d-db401c8be015-kube-api-access-gzg4f\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.522199 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.522218 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.522253 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.522345 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.624328 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-config\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.624729 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzg4f\" (UniqueName: \"kubernetes.io/projected/36a1f1e1-f620-422c-b66d-db401c8be015-kube-api-access-gzg4f\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.624758 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.625378 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-config\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.625705 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.625719 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.626234 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.626887 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.626999 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.627103 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.627690 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.645016 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzg4f\" (UniqueName: 
\"kubernetes.io/projected/36a1f1e1-f620-422c-b66d-db401c8be015-kube-api-access-gzg4f\") pod \"dnsmasq-dns-74f6bcbc87-5hcms\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.679731 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.807291 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-clpkz" event={"ID":"8221747f-ffad-4b2e-bbd9-94b7d25949cc","Type":"ContainerStarted","Data":"f26e017f19af5f717b0f8d349fc9ae2e86c4391db97399115453c902e31de17f"} Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.807544 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.838868 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-clpkz" podStartSLOduration=2.838849929 podStartE2EDuration="2.838849929s" podCreationTimestamp="2025-12-01 14:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:36.827059854 +0000 UTC m=+990.811273719" watchObservedRunningTime="2025-12-01 14:14:36.838849929 +0000 UTC m=+990.823063784" Dec 01 14:14:36 crc kubenswrapper[4585]: I1201 14:14:36.970421 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-5hcms"] Dec 01 14:14:36 crc kubenswrapper[4585]: W1201 14:14:36.983381 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36a1f1e1_f620_422c_b66d_db401c8be015.slice/crio-92ea8f40ebe9db56fa69f0a445b1cf0e5d0df8fa85aa52080388d91f808fe172 WatchSource:0}: Error finding container 92ea8f40ebe9db56fa69f0a445b1cf0e5d0df8fa85aa52080388d91f808fe172: Status 404 returned error can't find the container with id 92ea8f40ebe9db56fa69f0a445b1cf0e5d0df8fa85aa52080388d91f808fe172 Dec 01 14:14:37 crc kubenswrapper[4585]: I1201 14:14:37.821038 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" event={"ID":"36a1f1e1-f620-422c-b66d-db401c8be015","Type":"ContainerStarted","Data":"454e37522701b59d5c7376872f5534c7878a7027ca4dfb4e37d6f4044ec79619"} Dec 01 14:14:37 crc kubenswrapper[4585]: I1201 14:14:37.821356 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" event={"ID":"36a1f1e1-f620-422c-b66d-db401c8be015","Type":"ContainerStarted","Data":"92ea8f40ebe9db56fa69f0a445b1cf0e5d0df8fa85aa52080388d91f808fe172"} Dec 01 14:14:37 crc kubenswrapper[4585]: I1201 14:14:37.821785 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-clpkz" podUID="8221747f-ffad-4b2e-bbd9-94b7d25949cc" containerName="dnsmasq-dns" containerID="cri-o://f26e017f19af5f717b0f8d349fc9ae2e86c4391db97399115453c902e31de17f" gracePeriod=10 Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.063356 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a91f-account-create-update-v4qw7"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.064464 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a91f-account-create-update-v4qw7" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.070469 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.077040 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-k9pq4"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.080134 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k9pq4" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.087794 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-k9pq4"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.122049 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a91f-account-create-update-v4qw7"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.159127 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb978ed4-fbc2-4706-821e-3e820802d995-operator-scripts\") pod \"cinder-db-create-k9pq4\" (UID: \"fb978ed4-fbc2-4706-821e-3e820802d995\") " pod="openstack/cinder-db-create-k9pq4" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.159256 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2-operator-scripts\") pod \"cinder-a91f-account-create-update-v4qw7\" (UID: \"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2\") " pod="openstack/cinder-a91f-account-create-update-v4qw7" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.159319 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlfnx\" (UniqueName: \"kubernetes.io/projected/fb978ed4-fbc2-4706-821e-3e820802d995-kube-api-access-hlfnx\") pod \"cinder-db-create-k9pq4\" (UID: \"fb978ed4-fbc2-4706-821e-3e820802d995\") " pod="openstack/cinder-db-create-k9pq4" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.160524 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thtvw\" (UniqueName: \"kubernetes.io/projected/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2-kube-api-access-thtvw\") pod \"cinder-a91f-account-create-update-v4qw7\" (UID: \"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2\") " pod="openstack/cinder-a91f-account-create-update-v4qw7" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.201119 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-5bz9b"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.203034 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5bz9b" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.245573 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5bz9b"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.264300 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thtvw\" (UniqueName: \"kubernetes.io/projected/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2-kube-api-access-thtvw\") pod \"cinder-a91f-account-create-update-v4qw7\" (UID: \"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2\") " pod="openstack/cinder-a91f-account-create-update-v4qw7" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.264372 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb978ed4-fbc2-4706-821e-3e820802d995-operator-scripts\") pod \"cinder-db-create-k9pq4\" (UID: \"fb978ed4-fbc2-4706-821e-3e820802d995\") " pod="openstack/cinder-db-create-k9pq4" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.264451 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2-operator-scripts\") pod \"cinder-a91f-account-create-update-v4qw7\" (UID: \"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2\") " pod="openstack/cinder-a91f-account-create-update-v4qw7" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.264491 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlfnx\" (UniqueName: \"kubernetes.io/projected/fb978ed4-fbc2-4706-821e-3e820802d995-kube-api-access-hlfnx\") pod \"cinder-db-create-k9pq4\" (UID: \"fb978ed4-fbc2-4706-821e-3e820802d995\") " pod="openstack/cinder-db-create-k9pq4" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.265723 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb978ed4-fbc2-4706-821e-3e820802d995-operator-scripts\") pod \"cinder-db-create-k9pq4\" (UID: \"fb978ed4-fbc2-4706-821e-3e820802d995\") " pod="openstack/cinder-db-create-k9pq4" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.266435 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2-operator-scripts\") pod \"cinder-a91f-account-create-update-v4qw7\" (UID: \"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2\") " pod="openstack/cinder-a91f-account-create-update-v4qw7" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.304082 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlfnx\" (UniqueName: \"kubernetes.io/projected/fb978ed4-fbc2-4706-821e-3e820802d995-kube-api-access-hlfnx\") pod \"cinder-db-create-k9pq4\" (UID: \"fb978ed4-fbc2-4706-821e-3e820802d995\") " pod="openstack/cinder-db-create-k9pq4" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.310322 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thtvw\" (UniqueName: \"kubernetes.io/projected/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2-kube-api-access-thtvw\") pod \"cinder-a91f-account-create-update-v4qw7\" (UID: \"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2\") " pod="openstack/cinder-a91f-account-create-update-v4qw7" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.347550 4585 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-1a88-account-create-update-wkjzb"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.348701 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1a88-account-create-update-wkjzb" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.352243 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.379641 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a91f-account-create-update-v4qw7" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.381161 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9444ec25-acd8-4038-8e7f-052ec1ba2f36-operator-scripts\") pod \"barbican-db-create-5bz9b\" (UID: \"9444ec25-acd8-4038-8e7f-052ec1ba2f36\") " pod="openstack/barbican-db-create-5bz9b" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.381422 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jdcm\" (UniqueName: \"kubernetes.io/projected/9444ec25-acd8-4038-8e7f-052ec1ba2f36-kube-api-access-6jdcm\") pod \"barbican-db-create-5bz9b\" (UID: \"9444ec25-acd8-4038-8e7f-052ec1ba2f36\") " pod="openstack/barbican-db-create-5bz9b" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.393197 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k9pq4" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.405580 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1a88-account-create-update-wkjzb"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.485361 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9444ec25-acd8-4038-8e7f-052ec1ba2f36-operator-scripts\") pod \"barbican-db-create-5bz9b\" (UID: \"9444ec25-acd8-4038-8e7f-052ec1ba2f36\") " pod="openstack/barbican-db-create-5bz9b" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.486888 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd2vc\" (UniqueName: \"kubernetes.io/projected/0be6c2fe-329c-411f-8557-cf19f2f0be4c-kube-api-access-qd2vc\") pod \"barbican-1a88-account-create-update-wkjzb\" (UID: \"0be6c2fe-329c-411f-8557-cf19f2f0be4c\") " pod="openstack/barbican-1a88-account-create-update-wkjzb" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.486942 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0be6c2fe-329c-411f-8557-cf19f2f0be4c-operator-scripts\") pod \"barbican-1a88-account-create-update-wkjzb\" (UID: \"0be6c2fe-329c-411f-8557-cf19f2f0be4c\") " pod="openstack/barbican-1a88-account-create-update-wkjzb" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.487063 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jdcm\" (UniqueName: \"kubernetes.io/projected/9444ec25-acd8-4038-8e7f-052ec1ba2f36-kube-api-access-6jdcm\") pod \"barbican-db-create-5bz9b\" (UID: \"9444ec25-acd8-4038-8e7f-052ec1ba2f36\") " pod="openstack/barbican-db-create-5bz9b" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.491417 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9444ec25-acd8-4038-8e7f-052ec1ba2f36-operator-scripts\") pod \"barbican-db-create-5bz9b\" (UID: \"9444ec25-acd8-4038-8e7f-052ec1ba2f36\") " pod="openstack/barbican-db-create-5bz9b" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.491502 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-lrc4n"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.492732 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lrc4n" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.525987 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jdcm\" (UniqueName: \"kubernetes.io/projected/9444ec25-acd8-4038-8e7f-052ec1ba2f36-kube-api-access-6jdcm\") pod \"barbican-db-create-5bz9b\" (UID: \"9444ec25-acd8-4038-8e7f-052ec1ba2f36\") " pod="openstack/barbican-db-create-5bz9b" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.551051 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lrc4n"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.559475 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5bz9b" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.573541 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-fh5fg"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.574840 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.580727 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.580920 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mj6w8" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.581111 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.581211 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.588325 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0be6c2fe-329c-411f-8557-cf19f2f0be4c-operator-scripts\") pod \"barbican-1a88-account-create-update-wkjzb\" (UID: \"0be6c2fe-329c-411f-8557-cf19f2f0be4c\") " pod="openstack/barbican-1a88-account-create-update-wkjzb" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.588582 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqqcp\" (UniqueName: \"kubernetes.io/projected/977a54d5-9b5e-4399-80ce-6682e0a78d3c-kube-api-access-wqqcp\") pod \"neutron-db-create-lrc4n\" (UID: \"977a54d5-9b5e-4399-80ce-6682e0a78d3c\") " pod="openstack/neutron-db-create-lrc4n" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.588606 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd2vc\" (UniqueName: \"kubernetes.io/projected/0be6c2fe-329c-411f-8557-cf19f2f0be4c-kube-api-access-qd2vc\") pod \"barbican-1a88-account-create-update-wkjzb\" (UID: \"0be6c2fe-329c-411f-8557-cf19f2f0be4c\") " 
pod="openstack/barbican-1a88-account-create-update-wkjzb" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.588648 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/977a54d5-9b5e-4399-80ce-6682e0a78d3c-operator-scripts\") pod \"neutron-db-create-lrc4n\" (UID: \"977a54d5-9b5e-4399-80ce-6682e0a78d3c\") " pod="openstack/neutron-db-create-lrc4n" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.589144 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0be6c2fe-329c-411f-8557-cf19f2f0be4c-operator-scripts\") pod \"barbican-1a88-account-create-update-wkjzb\" (UID: \"0be6c2fe-329c-411f-8557-cf19f2f0be4c\") " pod="openstack/barbican-1a88-account-create-update-wkjzb" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.621753 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd2vc\" (UniqueName: \"kubernetes.io/projected/0be6c2fe-329c-411f-8557-cf19f2f0be4c-kube-api-access-qd2vc\") pod \"barbican-1a88-account-create-update-wkjzb\" (UID: \"0be6c2fe-329c-411f-8557-cf19f2f0be4c\") " pod="openstack/barbican-1a88-account-create-update-wkjzb" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.652142 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-fh5fg"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.690619 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqqcp\" (UniqueName: \"kubernetes.io/projected/977a54d5-9b5e-4399-80ce-6682e0a78d3c-kube-api-access-wqqcp\") pod \"neutron-db-create-lrc4n\" (UID: \"977a54d5-9b5e-4399-80ce-6682e0a78d3c\") " pod="openstack/neutron-db-create-lrc4n" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.690862 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/977a54d5-9b5e-4399-80ce-6682e0a78d3c-operator-scripts\") pod \"neutron-db-create-lrc4n\" (UID: \"977a54d5-9b5e-4399-80ce-6682e0a78d3c\") " pod="openstack/neutron-db-create-lrc4n" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.690984 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99104c3-84fc-45e1-8b1d-e92a2cf55633-combined-ca-bundle\") pod \"keystone-db-sync-fh5fg\" (UID: \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\") " pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.691098 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpzsj\" (UniqueName: \"kubernetes.io/projected/f99104c3-84fc-45e1-8b1d-e92a2cf55633-kube-api-access-wpzsj\") pod \"keystone-db-sync-fh5fg\" (UID: \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\") " pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.691222 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99104c3-84fc-45e1-8b1d-e92a2cf55633-config-data\") pod \"keystone-db-sync-fh5fg\" (UID: \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\") " pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.692245 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/977a54d5-9b5e-4399-80ce-6682e0a78d3c-operator-scripts\") pod \"neutron-db-create-lrc4n\" (UID: \"977a54d5-9b5e-4399-80ce-6682e0a78d3c\") " pod="openstack/neutron-db-create-lrc4n" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.703042 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8a75-account-create-update-qzf69"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.704421 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8a75-account-create-update-qzf69" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.710305 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.716840 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqqcp\" (UniqueName: \"kubernetes.io/projected/977a54d5-9b5e-4399-80ce-6682e0a78d3c-kube-api-access-wqqcp\") pod \"neutron-db-create-lrc4n\" (UID: \"977a54d5-9b5e-4399-80ce-6682e0a78d3c\") " pod="openstack/neutron-db-create-lrc4n" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.725159 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8a75-account-create-update-qzf69"] Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.771300 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1a88-account-create-update-wkjzb" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.797091 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99104c3-84fc-45e1-8b1d-e92a2cf55633-config-data\") pod \"keystone-db-sync-fh5fg\" (UID: \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\") " pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.797992 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bbbt\" (UniqueName: \"kubernetes.io/projected/27af24bb-3898-4376-9985-63237c74d33f-kube-api-access-5bbbt\") pod \"neutron-8a75-account-create-update-qzf69\" (UID: \"27af24bb-3898-4376-9985-63237c74d33f\") " pod="openstack/neutron-8a75-account-create-update-qzf69" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.798139 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99104c3-84fc-45e1-8b1d-e92a2cf55633-combined-ca-bundle\") pod \"keystone-db-sync-fh5fg\" (UID: \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\") " pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.798225 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27af24bb-3898-4376-9985-63237c74d33f-operator-scripts\") pod \"neutron-8a75-account-create-update-qzf69\" (UID: \"27af24bb-3898-4376-9985-63237c74d33f\") " pod="openstack/neutron-8a75-account-create-update-qzf69" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.798267 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpzsj\" (UniqueName: \"kubernetes.io/projected/f99104c3-84fc-45e1-8b1d-e92a2cf55633-kube-api-access-wpzsj\") pod \"keystone-db-sync-fh5fg\" (UID: \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\") " 
pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.810099 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99104c3-84fc-45e1-8b1d-e92a2cf55633-config-data\") pod \"keystone-db-sync-fh5fg\" (UID: \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\") " pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.810593 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99104c3-84fc-45e1-8b1d-e92a2cf55633-combined-ca-bundle\") pod \"keystone-db-sync-fh5fg\" (UID: \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\") " pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.827564 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpzsj\" (UniqueName: \"kubernetes.io/projected/f99104c3-84fc-45e1-8b1d-e92a2cf55633-kube-api-access-wpzsj\") pod \"keystone-db-sync-fh5fg\" (UID: \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\") " pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.880332 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lrc4n" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.890888 4585 generic.go:334] "Generic (PLEG): container finished" podID="8221747f-ffad-4b2e-bbd9-94b7d25949cc" containerID="f26e017f19af5f717b0f8d349fc9ae2e86c4391db97399115453c902e31de17f" exitCode=0 Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.891152 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-clpkz" event={"ID":"8221747f-ffad-4b2e-bbd9-94b7d25949cc","Type":"ContainerDied","Data":"f26e017f19af5f717b0f8d349fc9ae2e86c4391db97399115453c902e31de17f"} Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.895309 4585 generic.go:334] "Generic (PLEG): container finished" podID="36a1f1e1-f620-422c-b66d-db401c8be015" containerID="454e37522701b59d5c7376872f5534c7878a7027ca4dfb4e37d6f4044ec79619" exitCode=0 Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.895351 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" event={"ID":"36a1f1e1-f620-422c-b66d-db401c8be015","Type":"ContainerDied","Data":"454e37522701b59d5c7376872f5534c7878a7027ca4dfb4e37d6f4044ec79619"} Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.899710 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27af24bb-3898-4376-9985-63237c74d33f-operator-scripts\") pod \"neutron-8a75-account-create-update-qzf69\" (UID: \"27af24bb-3898-4376-9985-63237c74d33f\") " pod="openstack/neutron-8a75-account-create-update-qzf69" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.899802 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bbbt\" (UniqueName: \"kubernetes.io/projected/27af24bb-3898-4376-9985-63237c74d33f-kube-api-access-5bbbt\") pod \"neutron-8a75-account-create-update-qzf69\" (UID: \"27af24bb-3898-4376-9985-63237c74d33f\") " pod="openstack/neutron-8a75-account-create-update-qzf69" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.900577 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/27af24bb-3898-4376-9985-63237c74d33f-operator-scripts\") pod \"neutron-8a75-account-create-update-qzf69\" (UID: \"27af24bb-3898-4376-9985-63237c74d33f\") " pod="openstack/neutron-8a75-account-create-update-qzf69" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.910283 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:38 crc kubenswrapper[4585]: I1201 14:14:38.918186 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.001527 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-ovsdbserver-sb\") pod \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.001795 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-config\") pod \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.001927 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-dns-svc\") pod \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.002085 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-dns-swift-storage-0\") pod \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.002216 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-ovsdbserver-nb\") pod \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.002391 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pwh6\" (UniqueName: \"kubernetes.io/projected/8221747f-ffad-4b2e-bbd9-94b7d25949cc-kube-api-access-7pwh6\") pod \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\" (UID: \"8221747f-ffad-4b2e-bbd9-94b7d25949cc\") " Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.026250 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8221747f-ffad-4b2e-bbd9-94b7d25949cc-kube-api-access-7pwh6" (OuterVolumeSpecName: "kube-api-access-7pwh6") pod "8221747f-ffad-4b2e-bbd9-94b7d25949cc" (UID: "8221747f-ffad-4b2e-bbd9-94b7d25949cc"). InnerVolumeSpecName "kube-api-access-7pwh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.033541 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bbbt\" (UniqueName: \"kubernetes.io/projected/27af24bb-3898-4376-9985-63237c74d33f-kube-api-access-5bbbt\") pod \"neutron-8a75-account-create-update-qzf69\" (UID: \"27af24bb-3898-4376-9985-63237c74d33f\") " pod="openstack/neutron-8a75-account-create-update-qzf69" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.083835 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8a75-account-create-update-qzf69" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.102187 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-config" (OuterVolumeSpecName: "config") pod "8221747f-ffad-4b2e-bbd9-94b7d25949cc" (UID: "8221747f-ffad-4b2e-bbd9-94b7d25949cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.103952 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pwh6\" (UniqueName: \"kubernetes.io/projected/8221747f-ffad-4b2e-bbd9-94b7d25949cc-kube-api-access-7pwh6\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.103987 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.133461 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8221747f-ffad-4b2e-bbd9-94b7d25949cc" (UID: "8221747f-ffad-4b2e-bbd9-94b7d25949cc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.141177 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8221747f-ffad-4b2e-bbd9-94b7d25949cc" (UID: "8221747f-ffad-4b2e-bbd9-94b7d25949cc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.185768 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8221747f-ffad-4b2e-bbd9-94b7d25949cc" (UID: "8221747f-ffad-4b2e-bbd9-94b7d25949cc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.198704 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8221747f-ffad-4b2e-bbd9-94b7d25949cc" (UID: "8221747f-ffad-4b2e-bbd9-94b7d25949cc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.217881 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.217914 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.217924 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.217934 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8221747f-ffad-4b2e-bbd9-94b7d25949cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.357921 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-k9pq4"] Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.382543 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a91f-account-create-update-v4qw7"] Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.516484 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-5bz9b"] Dec 01 14:14:39 crc kubenswrapper[4585]: W1201 14:14:39.551268 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9444ec25_acd8_4038_8e7f_052ec1ba2f36.slice/crio-f84f1631b76658b83241d6b6296883bf773c5a9c4ed67f73b75e6a197131f5d8 WatchSource:0}: Error finding container f84f1631b76658b83241d6b6296883bf773c5a9c4ed67f73b75e6a197131f5d8: Status 404 returned error can't find the container with id f84f1631b76658b83241d6b6296883bf773c5a9c4ed67f73b75e6a197131f5d8 Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.904045 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5bz9b" event={"ID":"9444ec25-acd8-4038-8e7f-052ec1ba2f36","Type":"ContainerStarted","Data":"f84f1631b76658b83241d6b6296883bf773c5a9c4ed67f73b75e6a197131f5d8"} Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.909705 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-clpkz" event={"ID":"8221747f-ffad-4b2e-bbd9-94b7d25949cc","Type":"ContainerDied","Data":"0ef5c3ea209683a1ac074d0314872a8cc32447351a782476f2611d224a0a65db"} Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.909740 4585 scope.go:117] "RemoveContainer" containerID="f26e017f19af5f717b0f8d349fc9ae2e86c4391db97399115453c902e31de17f" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.909848 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-clpkz" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.928483 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a91f-account-create-update-v4qw7" event={"ID":"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2","Type":"ContainerStarted","Data":"824042bdef4d457c0c50bdb2e61a134d74854465c34f29dd09592c239618a939"} Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.928515 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a91f-account-create-update-v4qw7" event={"ID":"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2","Type":"ContainerStarted","Data":"9e71330c36c78e41a409ca0602ccc90ae2c68fecfad45f478b674ef1a5c9cf4b"} Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.943858 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" event={"ID":"36a1f1e1-f620-422c-b66d-db401c8be015","Type":"ContainerStarted","Data":"3dd39e7994be5549a91ddb1139b0379927b4d57c3046dd7791b50a8ed3f81d28"} Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.944598 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.949467 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k9pq4" event={"ID":"fb978ed4-fbc2-4706-821e-3e820802d995","Type":"ContainerStarted","Data":"290f7aca9059a288646ea974a383ec6db3819c88e856a2add53f4a313d8b00fe"} Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.949495 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k9pq4" event={"ID":"fb978ed4-fbc2-4706-821e-3e820802d995","Type":"ContainerStarted","Data":"c28224df2a5ab3e5ce76c8cc465a2dffa9a95049bd181f6c4f7be191364dbcd3"} Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.956747 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-a91f-account-create-update-v4qw7" podStartSLOduration=1.9567285989999998 podStartE2EDuration="1.956728599s" podCreationTimestamp="2025-12-01 14:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:39.952317191 +0000 UTC m=+993.936531036" watchObservedRunningTime="2025-12-01 14:14:39.956728599 +0000 UTC m=+993.940942454" Dec 01 14:14:39 crc kubenswrapper[4585]: I1201 14:14:39.963254 4585 scope.go:117] "RemoveContainer" containerID="9af3e001e6713a8464ffda57096c95b11f6f20a92b9ad1e2f7b2ef8cda667999" Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.071494 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-clpkz"] Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.092044 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-clpkz"] Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.094805 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" podStartSLOduration=4.094781059 podStartE2EDuration="4.094781059s" podCreationTimestamp="2025-12-01 14:14:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:40.030561037 +0000 UTC m=+994.014774892" watchObservedRunningTime="2025-12-01 14:14:40.094781059 +0000 UTC m=+994.078994924" Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 
14:14:40.118456 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-k9pq4" podStartSLOduration=2.1184335 podStartE2EDuration="2.1184335s" podCreationTimestamp="2025-12-01 14:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:40.072947677 +0000 UTC m=+994.057161542" watchObservedRunningTime="2025-12-01 14:14:40.1184335 +0000 UTC m=+994.102647355" Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.425417 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8221747f-ffad-4b2e-bbd9-94b7d25949cc" path="/var/lib/kubelet/pods/8221747f-ffad-4b2e-bbd9-94b7d25949cc/volumes" Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.517046 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lrc4n"] Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.583563 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1a88-account-create-update-wkjzb"] Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.592710 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8a75-account-create-update-qzf69"] Dec 01 14:14:40 crc kubenswrapper[4585]: W1201 14:14:40.599221 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27af24bb_3898_4376_9985_63237c74d33f.slice/crio-14189402f21faacf3b52d6e81203c4898a1aba0bd5915940daa2d02df9943c7f WatchSource:0}: Error finding container 14189402f21faacf3b52d6e81203c4898a1aba0bd5915940daa2d02df9943c7f: Status 404 returned error can't find the container with id 14189402f21faacf3b52d6e81203c4898a1aba0bd5915940daa2d02df9943c7f Dec 01 14:14:40 crc kubenswrapper[4585]: W1201 14:14:40.600393 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0be6c2fe_329c_411f_8557_cf19f2f0be4c.slice/crio-27218f47626c38bf7394b47a8248c11f1658a31b22c8a78812f94a20901eaf31 WatchSource:0}: Error finding container 27218f47626c38bf7394b47a8248c11f1658a31b22c8a78812f94a20901eaf31: Status 404 returned error can't find the container with id 27218f47626c38bf7394b47a8248c11f1658a31b22c8a78812f94a20901eaf31 Dec 01 14:14:40 crc kubenswrapper[4585]: W1201 14:14:40.606854 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf99104c3_84fc_45e1_8b1d_e92a2cf55633.slice/crio-6a4d2f1cf544500fe937b62c8561c571fbfab45526a6bf69b1281c4a82683630 WatchSource:0}: Error finding container 6a4d2f1cf544500fe937b62c8561c571fbfab45526a6bf69b1281c4a82683630: Status 404 returned error can't find the container with id 6a4d2f1cf544500fe937b62c8561c571fbfab45526a6bf69b1281c4a82683630 Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.607016 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-fh5fg"] Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.959676 4585 generic.go:334] "Generic (PLEG): container finished" podID="977a54d5-9b5e-4399-80ce-6682e0a78d3c" containerID="d9e2499863533663da2eb2ac3ce14aec95113619c7e1324357b02610d35f5953" exitCode=0 Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.959732 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lrc4n" 
event={"ID":"977a54d5-9b5e-4399-80ce-6682e0a78d3c","Type":"ContainerDied","Data":"d9e2499863533663da2eb2ac3ce14aec95113619c7e1324357b02610d35f5953"} Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.960143 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lrc4n" event={"ID":"977a54d5-9b5e-4399-80ce-6682e0a78d3c","Type":"ContainerStarted","Data":"148d92407a3bf34d53f8f9a6f89e5af6203848294c2c2a6d82e94fe65475a85f"} Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.961588 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-fh5fg" event={"ID":"f99104c3-84fc-45e1-8b1d-e92a2cf55633","Type":"ContainerStarted","Data":"6a4d2f1cf544500fe937b62c8561c571fbfab45526a6bf69b1281c4a82683630"} Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.963374 4585 generic.go:334] "Generic (PLEG): container finished" podID="fb978ed4-fbc2-4706-821e-3e820802d995" containerID="290f7aca9059a288646ea974a383ec6db3819c88e856a2add53f4a313d8b00fe" exitCode=0 Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.963418 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k9pq4" event={"ID":"fb978ed4-fbc2-4706-821e-3e820802d995","Type":"ContainerDied","Data":"290f7aca9059a288646ea974a383ec6db3819c88e856a2add53f4a313d8b00fe"} Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.964817 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8a75-account-create-update-qzf69" event={"ID":"27af24bb-3898-4376-9985-63237c74d33f","Type":"ContainerStarted","Data":"8bd698d13098a76d2d67f8c25fdd60899fe0d676c9b860ef4f351828a01641ec"} Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.964842 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8a75-account-create-update-qzf69" event={"ID":"27af24bb-3898-4376-9985-63237c74d33f","Type":"ContainerStarted","Data":"14189402f21faacf3b52d6e81203c4898a1aba0bd5915940daa2d02df9943c7f"} Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.969839 4585 generic.go:334] "Generic (PLEG): container finished" podID="9444ec25-acd8-4038-8e7f-052ec1ba2f36" containerID="6c1ef3d1614a76dcda37f03f84c33346175a6b23c90a90aa3258596190ac356f" exitCode=0 Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.969877 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5bz9b" event={"ID":"9444ec25-acd8-4038-8e7f-052ec1ba2f36","Type":"ContainerDied","Data":"6c1ef3d1614a76dcda37f03f84c33346175a6b23c90a90aa3258596190ac356f"} Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.972998 4585 generic.go:334] "Generic (PLEG): container finished" podID="bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2" containerID="824042bdef4d457c0c50bdb2e61a134d74854465c34f29dd09592c239618a939" exitCode=0 Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.973062 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a91f-account-create-update-v4qw7" event={"ID":"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2","Type":"ContainerDied","Data":"824042bdef4d457c0c50bdb2e61a134d74854465c34f29dd09592c239618a939"} Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.979249 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1a88-account-create-update-wkjzb" event={"ID":"0be6c2fe-329c-411f-8557-cf19f2f0be4c","Type":"ContainerStarted","Data":"435c0ba6b3fa213eaa1b88590bc3d810dc145ce2c8fb5ff6e1ef3ff9258846a7"} Dec 01 14:14:40 crc kubenswrapper[4585]: I1201 14:14:40.979286 4585 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-1a88-account-create-update-wkjzb" event={"ID":"0be6c2fe-329c-411f-8557-cf19f2f0be4c","Type":"ContainerStarted","Data":"27218f47626c38bf7394b47a8248c11f1658a31b22c8a78812f94a20901eaf31"} Dec 01 14:14:41 crc kubenswrapper[4585]: I1201 14:14:41.023334 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8a75-account-create-update-qzf69" podStartSLOduration=3.0233095 podStartE2EDuration="3.0233095s" podCreationTimestamp="2025-12-01 14:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:41.019802037 +0000 UTC m=+995.004015892" watchObservedRunningTime="2025-12-01 14:14:41.0233095 +0000 UTC m=+995.007523365" Dec 01 14:14:41 crc kubenswrapper[4585]: I1201 14:14:41.051425 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-1a88-account-create-update-wkjzb" podStartSLOduration=3.051405999 podStartE2EDuration="3.051405999s" podCreationTimestamp="2025-12-01 14:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:41.050805193 +0000 UTC m=+995.035019048" watchObservedRunningTime="2025-12-01 14:14:41.051405999 +0000 UTC m=+995.035619864" Dec 01 14:14:41 crc kubenswrapper[4585]: I1201 14:14:41.986943 4585 generic.go:334] "Generic (PLEG): container finished" podID="27af24bb-3898-4376-9985-63237c74d33f" containerID="8bd698d13098a76d2d67f8c25fdd60899fe0d676c9b860ef4f351828a01641ec" exitCode=0 Dec 01 14:14:41 crc kubenswrapper[4585]: I1201 14:14:41.987014 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8a75-account-create-update-qzf69" event={"ID":"27af24bb-3898-4376-9985-63237c74d33f","Type":"ContainerDied","Data":"8bd698d13098a76d2d67f8c25fdd60899fe0d676c9b860ef4f351828a01641ec"} Dec 01 14:14:41 crc kubenswrapper[4585]: I1201 14:14:41.988623 4585 generic.go:334] "Generic (PLEG): container finished" podID="0be6c2fe-329c-411f-8557-cf19f2f0be4c" containerID="435c0ba6b3fa213eaa1b88590bc3d810dc145ce2c8fb5ff6e1ef3ff9258846a7" exitCode=0 Dec 01 14:14:41 crc kubenswrapper[4585]: I1201 14:14:41.988745 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1a88-account-create-update-wkjzb" event={"ID":"0be6c2fe-329c-411f-8557-cf19f2f0be4c","Type":"ContainerDied","Data":"435c0ba6b3fa213eaa1b88590bc3d810dc145ce2c8fb5ff6e1ef3ff9258846a7"} Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.407085 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-k9pq4" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.495425 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb978ed4-fbc2-4706-821e-3e820802d995-operator-scripts\") pod \"fb978ed4-fbc2-4706-821e-3e820802d995\" (UID: \"fb978ed4-fbc2-4706-821e-3e820802d995\") " Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.495511 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlfnx\" (UniqueName: \"kubernetes.io/projected/fb978ed4-fbc2-4706-821e-3e820802d995-kube-api-access-hlfnx\") pod \"fb978ed4-fbc2-4706-821e-3e820802d995\" (UID: \"fb978ed4-fbc2-4706-821e-3e820802d995\") " Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.496831 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb978ed4-fbc2-4706-821e-3e820802d995-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb978ed4-fbc2-4706-821e-3e820802d995" (UID: "fb978ed4-fbc2-4706-821e-3e820802d995"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.528452 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb978ed4-fbc2-4706-821e-3e820802d995-kube-api-access-hlfnx" (OuterVolumeSpecName: "kube-api-access-hlfnx") pod "fb978ed4-fbc2-4706-821e-3e820802d995" (UID: "fb978ed4-fbc2-4706-821e-3e820802d995"). InnerVolumeSpecName "kube-api-access-hlfnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.583634 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a91f-account-create-update-v4qw7" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.597396 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb978ed4-fbc2-4706-821e-3e820802d995-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.597431 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlfnx\" (UniqueName: \"kubernetes.io/projected/fb978ed4-fbc2-4706-821e-3e820802d995-kube-api-access-hlfnx\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.685466 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-5bz9b" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.694656 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-lrc4n" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.698589 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2-operator-scripts\") pod \"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2\" (UID: \"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2\") " Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.698692 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thtvw\" (UniqueName: \"kubernetes.io/projected/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2-kube-api-access-thtvw\") pod \"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2\" (UID: \"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2\") " Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.700102 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2" (UID: "bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.710635 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2-kube-api-access-thtvw" (OuterVolumeSpecName: "kube-api-access-thtvw") pod "bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2" (UID: "bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2"). InnerVolumeSpecName "kube-api-access-thtvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.803546 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/977a54d5-9b5e-4399-80ce-6682e0a78d3c-operator-scripts\") pod \"977a54d5-9b5e-4399-80ce-6682e0a78d3c\" (UID: \"977a54d5-9b5e-4399-80ce-6682e0a78d3c\") " Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.803638 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9444ec25-acd8-4038-8e7f-052ec1ba2f36-operator-scripts\") pod \"9444ec25-acd8-4038-8e7f-052ec1ba2f36\" (UID: \"9444ec25-acd8-4038-8e7f-052ec1ba2f36\") " Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.803667 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jdcm\" (UniqueName: \"kubernetes.io/projected/9444ec25-acd8-4038-8e7f-052ec1ba2f36-kube-api-access-6jdcm\") pod \"9444ec25-acd8-4038-8e7f-052ec1ba2f36\" (UID: \"9444ec25-acd8-4038-8e7f-052ec1ba2f36\") " Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.803731 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqqcp\" (UniqueName: \"kubernetes.io/projected/977a54d5-9b5e-4399-80ce-6682e0a78d3c-kube-api-access-wqqcp\") pod \"977a54d5-9b5e-4399-80ce-6682e0a78d3c\" (UID: \"977a54d5-9b5e-4399-80ce-6682e0a78d3c\") " Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.804043 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/977a54d5-9b5e-4399-80ce-6682e0a78d3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "977a54d5-9b5e-4399-80ce-6682e0a78d3c" (UID: "977a54d5-9b5e-4399-80ce-6682e0a78d3c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.804487 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.804504 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thtvw\" (UniqueName: \"kubernetes.io/projected/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2-kube-api-access-thtvw\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.804514 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/977a54d5-9b5e-4399-80ce-6682e0a78d3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.807581 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/977a54d5-9b5e-4399-80ce-6682e0a78d3c-kube-api-access-wqqcp" (OuterVolumeSpecName: "kube-api-access-wqqcp") pod "977a54d5-9b5e-4399-80ce-6682e0a78d3c" (UID: "977a54d5-9b5e-4399-80ce-6682e0a78d3c"). InnerVolumeSpecName "kube-api-access-wqqcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.808410 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9444ec25-acd8-4038-8e7f-052ec1ba2f36-kube-api-access-6jdcm" (OuterVolumeSpecName: "kube-api-access-6jdcm") pod "9444ec25-acd8-4038-8e7f-052ec1ba2f36" (UID: "9444ec25-acd8-4038-8e7f-052ec1ba2f36"). InnerVolumeSpecName "kube-api-access-6jdcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.809437 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9444ec25-acd8-4038-8e7f-052ec1ba2f36-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9444ec25-acd8-4038-8e7f-052ec1ba2f36" (UID: "9444ec25-acd8-4038-8e7f-052ec1ba2f36"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.907559 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqqcp\" (UniqueName: \"kubernetes.io/projected/977a54d5-9b5e-4399-80ce-6682e0a78d3c-kube-api-access-wqqcp\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.907608 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9444ec25-acd8-4038-8e7f-052ec1ba2f36-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.907618 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jdcm\" (UniqueName: \"kubernetes.io/projected/9444ec25-acd8-4038-8e7f-052ec1ba2f36-kube-api-access-6jdcm\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.996484 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-5bz9b" event={"ID":"9444ec25-acd8-4038-8e7f-052ec1ba2f36","Type":"ContainerDied","Data":"f84f1631b76658b83241d6b6296883bf773c5a9c4ed67f73b75e6a197131f5d8"} Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.996511 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-5bz9b" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.996526 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f84f1631b76658b83241d6b6296883bf773c5a9c4ed67f73b75e6a197131f5d8" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.999522 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a91f-account-create-update-v4qw7" event={"ID":"bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2","Type":"ContainerDied","Data":"9e71330c36c78e41a409ca0602ccc90ae2c68fecfad45f478b674ef1a5c9cf4b"} Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.999565 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e71330c36c78e41a409ca0602ccc90ae2c68fecfad45f478b674ef1a5c9cf4b" Dec 01 14:14:42 crc kubenswrapper[4585]: I1201 14:14:42.999640 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a91f-account-create-update-v4qw7" Dec 01 14:14:43 crc kubenswrapper[4585]: I1201 14:14:43.008540 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lrc4n" event={"ID":"977a54d5-9b5e-4399-80ce-6682e0a78d3c","Type":"ContainerDied","Data":"148d92407a3bf34d53f8f9a6f89e5af6203848294c2c2a6d82e94fe65475a85f"} Dec 01 14:14:43 crc kubenswrapper[4585]: I1201 14:14:43.008577 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="148d92407a3bf34d53f8f9a6f89e5af6203848294c2c2a6d82e94fe65475a85f" Dec 01 14:14:43 crc kubenswrapper[4585]: I1201 14:14:43.008647 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lrc4n" Dec 01 14:14:43 crc kubenswrapper[4585]: I1201 14:14:43.011181 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k9pq4" Dec 01 14:14:43 crc kubenswrapper[4585]: I1201 14:14:43.012924 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k9pq4" event={"ID":"fb978ed4-fbc2-4706-821e-3e820802d995","Type":"ContainerDied","Data":"c28224df2a5ab3e5ce76c8cc465a2dffa9a95049bd181f6c4f7be191364dbcd3"} Dec 01 14:14:43 crc kubenswrapper[4585]: I1201 14:14:43.012949 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c28224df2a5ab3e5ce76c8cc465a2dffa9a95049bd181f6c4f7be191364dbcd3" Dec 01 14:14:43 crc kubenswrapper[4585]: I1201 14:14:43.716027 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:14:43 crc kubenswrapper[4585]: I1201 14:14:43.716307 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.727829 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1a88-account-create-update-wkjzb" Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.779599 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8a75-account-create-update-qzf69" Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.858284 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27af24bb-3898-4376-9985-63237c74d33f-operator-scripts\") pod \"27af24bb-3898-4376-9985-63237c74d33f\" (UID: \"27af24bb-3898-4376-9985-63237c74d33f\") " Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.858325 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bbbt\" (UniqueName: \"kubernetes.io/projected/27af24bb-3898-4376-9985-63237c74d33f-kube-api-access-5bbbt\") pod \"27af24bb-3898-4376-9985-63237c74d33f\" (UID: \"27af24bb-3898-4376-9985-63237c74d33f\") " Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.858367 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd2vc\" (UniqueName: \"kubernetes.io/projected/0be6c2fe-329c-411f-8557-cf19f2f0be4c-kube-api-access-qd2vc\") pod \"0be6c2fe-329c-411f-8557-cf19f2f0be4c\" (UID: \"0be6c2fe-329c-411f-8557-cf19f2f0be4c\") " Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.858390 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0be6c2fe-329c-411f-8557-cf19f2f0be4c-operator-scripts\") pod \"0be6c2fe-329c-411f-8557-cf19f2f0be4c\" (UID: \"0be6c2fe-329c-411f-8557-cf19f2f0be4c\") " Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.859108 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27af24bb-3898-4376-9985-63237c74d33f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27af24bb-3898-4376-9985-63237c74d33f" (UID: "27af24bb-3898-4376-9985-63237c74d33f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.859114 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0be6c2fe-329c-411f-8557-cf19f2f0be4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0be6c2fe-329c-411f-8557-cf19f2f0be4c" (UID: "0be6c2fe-329c-411f-8557-cf19f2f0be4c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.859595 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27af24bb-3898-4376-9985-63237c74d33f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.859624 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0be6c2fe-329c-411f-8557-cf19f2f0be4c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.861601 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27af24bb-3898-4376-9985-63237c74d33f-kube-api-access-5bbbt" (OuterVolumeSpecName: "kube-api-access-5bbbt") pod "27af24bb-3898-4376-9985-63237c74d33f" (UID: "27af24bb-3898-4376-9985-63237c74d33f"). InnerVolumeSpecName "kube-api-access-5bbbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.861864 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be6c2fe-329c-411f-8557-cf19f2f0be4c-kube-api-access-qd2vc" (OuterVolumeSpecName: "kube-api-access-qd2vc") pod "0be6c2fe-329c-411f-8557-cf19f2f0be4c" (UID: "0be6c2fe-329c-411f-8557-cf19f2f0be4c"). InnerVolumeSpecName "kube-api-access-qd2vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.961064 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bbbt\" (UniqueName: \"kubernetes.io/projected/27af24bb-3898-4376-9985-63237c74d33f-kube-api-access-5bbbt\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:45 crc kubenswrapper[4585]: I1201 14:14:45.961100 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd2vc\" (UniqueName: \"kubernetes.io/projected/0be6c2fe-329c-411f-8557-cf19f2f0be4c-kube-api-access-qd2vc\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:46 crc kubenswrapper[4585]: I1201 14:14:46.038909 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-fh5fg" event={"ID":"f99104c3-84fc-45e1-8b1d-e92a2cf55633","Type":"ContainerStarted","Data":"cd2e776b376b295748b994c27808891c21e62c16b16ebbc155364236a0b76001"} Dec 01 14:14:46 crc kubenswrapper[4585]: I1201 14:14:46.041599 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8a75-account-create-update-qzf69" event={"ID":"27af24bb-3898-4376-9985-63237c74d33f","Type":"ContainerDied","Data":"14189402f21faacf3b52d6e81203c4898a1aba0bd5915940daa2d02df9943c7f"} Dec 01 14:14:46 crc kubenswrapper[4585]: I1201 14:14:46.041643 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14189402f21faacf3b52d6e81203c4898a1aba0bd5915940daa2d02df9943c7f" Dec 01 14:14:46 crc kubenswrapper[4585]: I1201 14:14:46.041715 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8a75-account-create-update-qzf69" Dec 01 14:14:46 crc kubenswrapper[4585]: I1201 14:14:46.045425 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1a88-account-create-update-wkjzb" event={"ID":"0be6c2fe-329c-411f-8557-cf19f2f0be4c","Type":"ContainerDied","Data":"27218f47626c38bf7394b47a8248c11f1658a31b22c8a78812f94a20901eaf31"} Dec 01 14:14:46 crc kubenswrapper[4585]: I1201 14:14:46.045582 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27218f47626c38bf7394b47a8248c11f1658a31b22c8a78812f94a20901eaf31" Dec 01 14:14:46 crc kubenswrapper[4585]: I1201 14:14:46.045751 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1a88-account-create-update-wkjzb" Dec 01 14:14:46 crc kubenswrapper[4585]: I1201 14:14:46.069658 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-fh5fg" podStartSLOduration=3.048898832 podStartE2EDuration="8.069640515s" podCreationTimestamp="2025-12-01 14:14:38 +0000 UTC" firstStartedPulling="2025-12-01 14:14:40.615225172 +0000 UTC m=+994.599439017" lastFinishedPulling="2025-12-01 14:14:45.635966845 +0000 UTC m=+999.620180700" observedRunningTime="2025-12-01 14:14:46.064929549 +0000 UTC m=+1000.049143434" watchObservedRunningTime="2025-12-01 14:14:46.069640515 +0000 UTC m=+1000.053854370" Dec 01 14:14:46 crc kubenswrapper[4585]: I1201 14:14:46.681172 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:14:46 crc kubenswrapper[4585]: I1201 14:14:46.759561 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xb2ht"] Dec 01 14:14:46 crc kubenswrapper[4585]: I1201 14:14:46.760547 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-xb2ht" podUID="dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" containerName="dnsmasq-dns" containerID="cri-o://4b15fe80d49b4bf44ec4a241ee2fe3d17d389b28995296458a1b5f12adcb2cde" gracePeriod=10 Dec 01 14:14:46 crc kubenswrapper[4585]: I1201 14:14:46.961344 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-xb2ht" podUID="dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.055117 4585 generic.go:334] "Generic (PLEG): container finished" podID="dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" containerID="4b15fe80d49b4bf44ec4a241ee2fe3d17d389b28995296458a1b5f12adcb2cde" exitCode=0 Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.055958 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xb2ht" event={"ID":"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53","Type":"ContainerDied","Data":"4b15fe80d49b4bf44ec4a241ee2fe3d17d389b28995296458a1b5f12adcb2cde"} Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.317809 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.385141 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-config\") pod \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.385235 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf9qj\" (UniqueName: \"kubernetes.io/projected/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-kube-api-access-rf9qj\") pod \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.385290 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-dns-svc\") pod \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.385343 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-ovsdbserver-nb\") pod \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.385419 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-ovsdbserver-sb\") pod \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\" (UID: \"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53\") " Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.422724 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-kube-api-access-rf9qj" (OuterVolumeSpecName: "kube-api-access-rf9qj") pod "dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" (UID: "dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53"). InnerVolumeSpecName "kube-api-access-rf9qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.445188 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" (UID: "dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.448738 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" (UID: "dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.451919 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" (UID: "dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.455615 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-config" (OuterVolumeSpecName: "config") pod "dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" (UID: "dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.487520 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.487566 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.487579 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf9qj\" (UniqueName: \"kubernetes.io/projected/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-kube-api-access-rf9qj\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.487593 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:47 crc kubenswrapper[4585]: I1201 14:14:47.487608 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:48 crc kubenswrapper[4585]: I1201 14:14:48.064181 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-xb2ht" event={"ID":"dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53","Type":"ContainerDied","Data":"133dad6975a3e638ced50ddbd5256be72a3a6128e335a2cb037f1d126d9d3f81"} Dec 01 14:14:48 crc kubenswrapper[4585]: I1201 14:14:48.064199 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-xb2ht" Dec 01 14:14:48 crc kubenswrapper[4585]: I1201 14:14:48.064500 4585 scope.go:117] "RemoveContainer" containerID="4b15fe80d49b4bf44ec4a241ee2fe3d17d389b28995296458a1b5f12adcb2cde" Dec 01 14:14:48 crc kubenswrapper[4585]: I1201 14:14:48.086248 4585 scope.go:117] "RemoveContainer" containerID="ac96dd283f58772aef8cb613c5fdec3760127dd34d7c4f646cfe65ec65e6f821" Dec 01 14:14:48 crc kubenswrapper[4585]: I1201 14:14:48.100959 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xb2ht"] Dec 01 14:14:48 crc kubenswrapper[4585]: I1201 14:14:48.112026 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-xb2ht"] Dec 01 14:14:48 crc kubenswrapper[4585]: I1201 14:14:48.423385 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" path="/var/lib/kubelet/pods/dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53/volumes" Dec 01 14:14:49 crc kubenswrapper[4585]: I1201 14:14:49.072706 4585 generic.go:334] "Generic (PLEG): container finished" podID="f99104c3-84fc-45e1-8b1d-e92a2cf55633" containerID="cd2e776b376b295748b994c27808891c21e62c16b16ebbc155364236a0b76001" exitCode=0 Dec 01 14:14:49 crc kubenswrapper[4585]: I1201 14:14:49.072767 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-fh5fg" event={"ID":"f99104c3-84fc-45e1-8b1d-e92a2cf55633","Type":"ContainerDied","Data":"cd2e776b376b295748b994c27808891c21e62c16b16ebbc155364236a0b76001"} Dec 01 14:14:50 crc kubenswrapper[4585]: I1201 14:14:50.412421 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:50 crc kubenswrapper[4585]: I1201 14:14:50.534057 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99104c3-84fc-45e1-8b1d-e92a2cf55633-config-data\") pod \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\" (UID: \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\") " Dec 01 14:14:50 crc kubenswrapper[4585]: I1201 14:14:50.534303 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpzsj\" (UniqueName: \"kubernetes.io/projected/f99104c3-84fc-45e1-8b1d-e92a2cf55633-kube-api-access-wpzsj\") pod \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\" (UID: \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\") " Dec 01 14:14:50 crc kubenswrapper[4585]: I1201 14:14:50.534357 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99104c3-84fc-45e1-8b1d-e92a2cf55633-combined-ca-bundle\") pod \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\" (UID: \"f99104c3-84fc-45e1-8b1d-e92a2cf55633\") " Dec 01 14:14:50 crc kubenswrapper[4585]: I1201 14:14:50.542274 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99104c3-84fc-45e1-8b1d-e92a2cf55633-kube-api-access-wpzsj" (OuterVolumeSpecName: "kube-api-access-wpzsj") pod "f99104c3-84fc-45e1-8b1d-e92a2cf55633" (UID: "f99104c3-84fc-45e1-8b1d-e92a2cf55633"). InnerVolumeSpecName "kube-api-access-wpzsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:50 crc kubenswrapper[4585]: I1201 14:14:50.558632 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99104c3-84fc-45e1-8b1d-e92a2cf55633-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f99104c3-84fc-45e1-8b1d-e92a2cf55633" (UID: "f99104c3-84fc-45e1-8b1d-e92a2cf55633"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:50 crc kubenswrapper[4585]: I1201 14:14:50.580601 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99104c3-84fc-45e1-8b1d-e92a2cf55633-config-data" (OuterVolumeSpecName: "config-data") pod "f99104c3-84fc-45e1-8b1d-e92a2cf55633" (UID: "f99104c3-84fc-45e1-8b1d-e92a2cf55633"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:50 crc kubenswrapper[4585]: I1201 14:14:50.639034 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f99104c3-84fc-45e1-8b1d-e92a2cf55633-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:50 crc kubenswrapper[4585]: I1201 14:14:50.639061 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpzsj\" (UniqueName: \"kubernetes.io/projected/f99104c3-84fc-45e1-8b1d-e92a2cf55633-kube-api-access-wpzsj\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:50 crc kubenswrapper[4585]: I1201 14:14:50.639071 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f99104c3-84fc-45e1-8b1d-e92a2cf55633-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.091187 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-fh5fg" event={"ID":"f99104c3-84fc-45e1-8b1d-e92a2cf55633","Type":"ContainerDied","Data":"6a4d2f1cf544500fe937b62c8561c571fbfab45526a6bf69b1281c4a82683630"} Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.091234 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a4d2f1cf544500fe937b62c8561c571fbfab45526a6bf69b1281c4a82683630" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.091251 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-fh5fg" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.388532 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jfbr6"] Dec 01 14:14:51 crc kubenswrapper[4585]: E1201 14:14:51.389054 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27af24bb-3898-4376-9985-63237c74d33f" containerName="mariadb-account-create-update" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.389119 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="27af24bb-3898-4376-9985-63237c74d33f" containerName="mariadb-account-create-update" Dec 01 14:14:51 crc kubenswrapper[4585]: E1201 14:14:51.389175 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8221747f-ffad-4b2e-bbd9-94b7d25949cc" containerName="dnsmasq-dns" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.389242 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8221747f-ffad-4b2e-bbd9-94b7d25949cc" containerName="dnsmasq-dns" Dec 01 14:14:51 crc kubenswrapper[4585]: E1201 14:14:51.389298 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9444ec25-acd8-4038-8e7f-052ec1ba2f36" containerName="mariadb-database-create" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.389343 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="9444ec25-acd8-4038-8e7f-052ec1ba2f36" containerName="mariadb-database-create" Dec 01 14:14:51 crc kubenswrapper[4585]: E1201 14:14:51.389392 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" containerName="init" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.389435 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" containerName="init" Dec 01 14:14:51 crc kubenswrapper[4585]: E1201 14:14:51.389496 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2" containerName="mariadb-account-create-update" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.389546 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2" containerName="mariadb-account-create-update" Dec 01 14:14:51 crc kubenswrapper[4585]: E1201 14:14:51.389597 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99104c3-84fc-45e1-8b1d-e92a2cf55633" containerName="keystone-db-sync" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.389663 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99104c3-84fc-45e1-8b1d-e92a2cf55633" containerName="keystone-db-sync" Dec 01 14:14:51 crc kubenswrapper[4585]: E1201 14:14:51.389715 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" containerName="dnsmasq-dns" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.389792 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" containerName="dnsmasq-dns" Dec 01 14:14:51 crc kubenswrapper[4585]: E1201 14:14:51.389857 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8221747f-ffad-4b2e-bbd9-94b7d25949cc" containerName="init" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.389912 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8221747f-ffad-4b2e-bbd9-94b7d25949cc" containerName="init" Dec 01 14:14:51 crc kubenswrapper[4585]: E1201 14:14:51.389966 4585 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="977a54d5-9b5e-4399-80ce-6682e0a78d3c" containerName="mariadb-database-create" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.390047 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="977a54d5-9b5e-4399-80ce-6682e0a78d3c" containerName="mariadb-database-create" Dec 01 14:14:51 crc kubenswrapper[4585]: E1201 14:14:51.390097 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb978ed4-fbc2-4706-821e-3e820802d995" containerName="mariadb-database-create" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.390140 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb978ed4-fbc2-4706-821e-3e820802d995" containerName="mariadb-database-create" Dec 01 14:14:51 crc kubenswrapper[4585]: E1201 14:14:51.390190 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be6c2fe-329c-411f-8557-cf19f2f0be4c" containerName="mariadb-account-create-update" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.390234 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be6c2fe-329c-411f-8557-cf19f2f0be4c" containerName="mariadb-account-create-update" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.390448 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="977a54d5-9b5e-4399-80ce-6682e0a78d3c" containerName="mariadb-database-create" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.391809 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9bc3e1-ba72-4f52-9e0b-5c5c38d4db53" containerName="dnsmasq-dns" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.391876 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb978ed4-fbc2-4706-821e-3e820802d995" containerName="mariadb-database-create" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.391925 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f99104c3-84fc-45e1-8b1d-e92a2cf55633" containerName="keystone-db-sync" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.391987 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8221747f-ffad-4b2e-bbd9-94b7d25949cc" containerName="dnsmasq-dns" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.392042 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="9444ec25-acd8-4038-8e7f-052ec1ba2f36" containerName="mariadb-database-create" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.392113 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="27af24bb-3898-4376-9985-63237c74d33f" containerName="mariadb-account-create-update" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.392170 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be6c2fe-329c-411f-8557-cf19f2f0be4c" containerName="mariadb-account-create-update" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.392230 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2" containerName="mariadb-account-create-update" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.393201 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.411558 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jfbr6"] Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.450312 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-dns-svc\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.450372 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s785n\" (UniqueName: \"kubernetes.io/projected/8b99ad93-74e8-47fc-b11b-df18ab6edf87-kube-api-access-s785n\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.450422 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.450636 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.450667 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.450702 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-config\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.505334 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9f9zw"] Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.506654 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.517525 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9f9zw"] Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.520779 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mj6w8" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.521184 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.521377 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.521549 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.521727 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.551744 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.551804 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-config\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.551837 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-dns-svc\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.551867 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s785n\" (UniqueName: \"kubernetes.io/projected/8b99ad93-74e8-47fc-b11b-df18ab6edf87-kube-api-access-s785n\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.551949 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.552047 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.560842 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.565467 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.567911 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-config\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.596144 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.599392 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s785n\" (UniqueName: \"kubernetes.io/projected/8b99ad93-74e8-47fc-b11b-df18ab6edf87-kube-api-access-s785n\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.633301 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-dns-svc\") pod \"dnsmasq-dns-847c4cc679-jfbr6\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.648121 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54d4dd665-7g5hl"] Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.649602 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.654380 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.655103 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.657396 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-7fr58" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.660115 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-config-data\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.660254 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-fernet-keys\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.660280 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msvcz\" (UniqueName: \"kubernetes.io/projected/2d955d7c-086f-484a-872c-43764c37a3b0-kube-api-access-msvcz\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.660139 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.660657 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-scripts\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.660796 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-credential-keys\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.660949 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-combined-ca-bundle\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.661470 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54d4dd665-7g5hl"] Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.722444 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.762375 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-combined-ca-bundle\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.762437 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/477277de-eaf5-4536-90d9-9091737cb66c-scripts\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.762487 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-config-data\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.762519 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-fernet-keys\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.762541 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/477277de-eaf5-4536-90d9-9091737cb66c-logs\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.762567 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msvcz\" (UniqueName: \"kubernetes.io/projected/2d955d7c-086f-484a-872c-43764c37a3b0-kube-api-access-msvcz\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.762607 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/477277de-eaf5-4536-90d9-9091737cb66c-config-data\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.762645 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqpbj\" (UniqueName: \"kubernetes.io/projected/477277de-eaf5-4536-90d9-9091737cb66c-kube-api-access-xqpbj\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.762668 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-scripts\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc 
kubenswrapper[4585]: I1201 14:14:51.762712 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-credential-keys\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.762770 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/477277de-eaf5-4536-90d9-9091737cb66c-horizon-secret-key\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.777837 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-combined-ca-bundle\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.778865 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-fernet-keys\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.786672 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-config-data\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.805355 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8556bc9b75-ldp9m"] Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.805516 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-credential-keys\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.806762 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.813403 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8556bc9b75-ldp9m"] Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.815889 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msvcz\" (UniqueName: \"kubernetes.io/projected/2d955d7c-086f-484a-872c-43764c37a3b0-kube-api-access-msvcz\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.821008 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-scripts\") pod \"keystone-bootstrap-9f9zw\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.839401 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.864038 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/477277de-eaf5-4536-90d9-9091737cb66c-horizon-secret-key\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.864101 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/477277de-eaf5-4536-90d9-9091737cb66c-scripts\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.864126 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a389fb6-e678-4552-8cf8-3aea857a545c-horizon-secret-key\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.864151 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a389fb6-e678-4552-8cf8-3aea857a545c-config-data\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.864177 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a389fb6-e678-4552-8cf8-3aea857a545c-logs\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.864204 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/477277de-eaf5-4536-90d9-9091737cb66c-logs\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.864235 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a389fb6-e678-4552-8cf8-3aea857a545c-scripts\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.864255 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/477277de-eaf5-4536-90d9-9091737cb66c-config-data\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.864284 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqpbj\" (UniqueName: \"kubernetes.io/projected/477277de-eaf5-4536-90d9-9091737cb66c-kube-api-access-xqpbj\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.864329 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82jp9\" (UniqueName: \"kubernetes.io/projected/5a389fb6-e678-4552-8cf8-3aea857a545c-kube-api-access-82jp9\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.864742 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/477277de-eaf5-4536-90d9-9091737cb66c-logs\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.866600 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/477277de-eaf5-4536-90d9-9091737cb66c-scripts\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.869069 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/477277de-eaf5-4536-90d9-9091737cb66c-config-data\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.880585 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/477277de-eaf5-4536-90d9-9091737cb66c-horizon-secret-key\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.899675 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqpbj\" (UniqueName: \"kubernetes.io/projected/477277de-eaf5-4536-90d9-9091737cb66c-kube-api-access-xqpbj\") pod \"horizon-54d4dd665-7g5hl\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.967887 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/5a389fb6-e678-4552-8cf8-3aea857a545c-config-data\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.968224 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a389fb6-e678-4552-8cf8-3aea857a545c-horizon-secret-key\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.968257 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a389fb6-e678-4552-8cf8-3aea857a545c-logs\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.968298 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a389fb6-e678-4552-8cf8-3aea857a545c-scripts\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.968362 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82jp9\" (UniqueName: \"kubernetes.io/projected/5a389fb6-e678-4552-8cf8-3aea857a545c-kube-api-access-82jp9\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.969437 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a389fb6-e678-4552-8cf8-3aea857a545c-logs\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.969853 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a389fb6-e678-4552-8cf8-3aea857a545c-config-data\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.970366 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a389fb6-e678-4552-8cf8-3aea857a545c-scripts\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.972211 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:14:51 crc kubenswrapper[4585]: I1201 14:14:51.977491 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a389fb6-e678-4552-8cf8-3aea857a545c-horizon-secret-key\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:51.999782 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82jp9\" (UniqueName: \"kubernetes.io/projected/5a389fb6-e678-4552-8cf8-3aea857a545c-kube-api-access-82jp9\") pod \"horizon-8556bc9b75-ldp9m\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.000255 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.168924 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kg2tb"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.175667 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.179859 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kg2tb"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.180262 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.180338 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.180448 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4p2xv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.234479 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-wmh4t"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.235659 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.239324 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.239536 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9gnvq" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.239688 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.257033 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-sf4wz"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.258108 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.274519 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fkm5f" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.274811 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.283135 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqs6\" (UniqueName: \"kubernetes.io/projected/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-kube-api-access-bdqs6\") pod \"neutron-db-sync-kg2tb\" (UID: \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\") " pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.283276 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-config\") pod \"neutron-db-sync-kg2tb\" (UID: \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\") " pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.283363 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-combined-ca-bundle\") pod \"neutron-db-sync-kg2tb\" (UID: \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\") " pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.323026 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-sf4wz"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.343137 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wmh4t"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.389131 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7557d295-9d3a-4d0f-933a-77390e7e179e-db-sync-config-data\") pod \"barbican-db-sync-sf4wz\" (UID: \"7557d295-9d3a-4d0f-933a-77390e7e179e\") " pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.389357 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqs6\" (UniqueName: \"kubernetes.io/projected/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-kube-api-access-bdqs6\") pod \"neutron-db-sync-kg2tb\" (UID: \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\") " pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.389464 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7557d295-9d3a-4d0f-933a-77390e7e179e-combined-ca-bundle\") pod \"barbican-db-sync-sf4wz\" (UID: \"7557d295-9d3a-4d0f-933a-77390e7e179e\") " pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.389530 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-config-data\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.389598 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-config\") pod \"neutron-db-sync-kg2tb\" (UID: \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\") " pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.389670 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-etc-machine-id\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.389740 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-combined-ca-bundle\") pod \"neutron-db-sync-kg2tb\" (UID: \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\") " pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.389822 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv5nz\" (UniqueName: \"kubernetes.io/projected/7557d295-9d3a-4d0f-933a-77390e7e179e-kube-api-access-fv5nz\") pod \"barbican-db-sync-sf4wz\" (UID: \"7557d295-9d3a-4d0f-933a-77390e7e179e\") " pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.389900 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-scripts\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.389962 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-db-sync-config-data\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.390072 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8c27\" (UniqueName: \"kubernetes.io/projected/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-kube-api-access-m8c27\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.390142 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-combined-ca-bundle\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.403695 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-combined-ca-bundle\") pod \"neutron-db-sync-kg2tb\" (UID: \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\") " pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.412222 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-config\") pod \"neutron-db-sync-kg2tb\" (UID: \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\") " pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.424636 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqs6\" (UniqueName: \"kubernetes.io/projected/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-kube-api-access-bdqs6\") pod \"neutron-db-sync-kg2tb\" (UID: \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\") " pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.450719 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jfbr6"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.478917 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dpwxn"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.480542 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.496530 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv5nz\" (UniqueName: \"kubernetes.io/projected/7557d295-9d3a-4d0f-933a-77390e7e179e-kube-api-access-fv5nz\") pod \"barbican-db-sync-sf4wz\" (UID: \"7557d295-9d3a-4d0f-933a-77390e7e179e\") " pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.497116 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-scripts\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.497218 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-db-sync-config-data\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.497308 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8c27\" (UniqueName: \"kubernetes.io/projected/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-kube-api-access-m8c27\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.497401 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-combined-ca-bundle\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.497490 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7557d295-9d3a-4d0f-933a-77390e7e179e-db-sync-config-data\") pod \"barbican-db-sync-sf4wz\" (UID: \"7557d295-9d3a-4d0f-933a-77390e7e179e\") " pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.497592 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7557d295-9d3a-4d0f-933a-77390e7e179e-combined-ca-bundle\") pod \"barbican-db-sync-sf4wz\" (UID: \"7557d295-9d3a-4d0f-933a-77390e7e179e\") " pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.497656 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-config-data\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.497725 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-etc-machine-id\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.497837 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-etc-machine-id\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.503526 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-825nv"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.504578 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.507732 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7557d295-9d3a-4d0f-933a-77390e7e179e-combined-ca-bundle\") pod \"barbican-db-sync-sf4wz\" (UID: \"7557d295-9d3a-4d0f-933a-77390e7e179e\") " pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.514300 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.514839 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-scripts\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.515133 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wjcrj" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.516619 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.522274 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.526803 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-combined-ca-bundle\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.526855 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-db-sync-config-data\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.533511 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7557d295-9d3a-4d0f-933a-77390e7e179e-db-sync-config-data\") pod \"barbican-db-sync-sf4wz\" (UID: \"7557d295-9d3a-4d0f-933a-77390e7e179e\") " pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.548552 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8c27\" (UniqueName: \"kubernetes.io/projected/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-kube-api-access-m8c27\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.551475 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv5nz\" (UniqueName: \"kubernetes.io/projected/7557d295-9d3a-4d0f-933a-77390e7e179e-kube-api-access-fv5nz\") pod \"barbican-db-sync-sf4wz\" (UID: \"7557d295-9d3a-4d0f-933a-77390e7e179e\") " pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.556061 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-config-data\") pod \"cinder-db-sync-wmh4t\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.564755 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-825nv"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.583314 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dpwxn"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.599837 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2vkv\" (UniqueName: \"kubernetes.io/projected/c7490835-3211-4849-83cb-ec2e642df346-kube-api-access-d2vkv\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.600138 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f08b08-85f9-4df9-97ce-f0f25238e889-logs\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.600173 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-config\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.600192 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.600218 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.600237 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.600254 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqd5c\" (UniqueName: \"kubernetes.io/projected/30f08b08-85f9-4df9-97ce-f0f25238e889-kube-api-access-zqd5c\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.600274 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-config-data\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.600302 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.600352 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-scripts\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.600371 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-combined-ca-bundle\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: 
I1201 14:14:52.604107 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jfbr6"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.609389 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.637236 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.638762 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.659326 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.663547 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.676312 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.682729 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.668890 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.737772 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-config\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.737850 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.737923 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.737990 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.738029 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqd5c\" (UniqueName: \"kubernetes.io/projected/30f08b08-85f9-4df9-97ce-f0f25238e889-kube-api-access-zqd5c\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.738074 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-config-data\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.738169 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.738310 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-scripts\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.738342 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-combined-ca-bundle\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.738459 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2vkv\" (UniqueName: \"kubernetes.io/projected/c7490835-3211-4849-83cb-ec2e642df346-kube-api-access-d2vkv\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.738506 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f08b08-85f9-4df9-97ce-f0f25238e889-logs\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.742116 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rvw5c" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.746481 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f08b08-85f9-4df9-97ce-f0f25238e889-logs\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.749775 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.749804 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.754547 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-scripts\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.758389 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-config-data\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.763132 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.806442 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqd5c\" (UniqueName: \"kubernetes.io/projected/30f08b08-85f9-4df9-97ce-f0f25238e889-kube-api-access-zqd5c\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.807818 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2vkv\" (UniqueName: \"kubernetes.io/projected/c7490835-3211-4849-83cb-ec2e642df346-kube-api-access-d2vkv\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.820174 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-config\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.820316 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-dpwxn\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.839793 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9f9zw"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.847455 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.882133 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.882228 
4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkzpl\" (UniqueName: \"kubernetes.io/projected/13a57460-238e-4006-aacd-4d7f1f0322a0-kube-api-access-dkzpl\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.882377 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-scripts\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.882432 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13a57460-238e-4006-aacd-4d7f1f0322a0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.882526 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13a57460-238e-4006-aacd-4d7f1f0322a0-logs\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.882578 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.882610 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-config-data\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.861286 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.866753 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.890623 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.896350 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.905590 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.910051 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-combined-ca-bundle\") pod \"placement-db-sync-825nv\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " pod="openstack/placement-db-sync-825nv" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.920306 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.987924 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13a57460-238e-4006-aacd-4d7f1f0322a0-logs\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.988034 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.988082 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-config-data\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.988120 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.988158 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.988181 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkzpl\" (UniqueName: \"kubernetes.io/projected/13a57460-238e-4006-aacd-4d7f1f0322a0-kube-api-access-dkzpl\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.988255 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-scripts\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " 
pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.988317 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13a57460-238e-4006-aacd-4d7f1f0322a0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.988436 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13a57460-238e-4006-aacd-4d7f1f0322a0-logs\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.988707 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 01 14:14:52 crc kubenswrapper[4585]: I1201 14:14:52.988850 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13a57460-238e-4006-aacd-4d7f1f0322a0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:52.999816 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.002043 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.018639 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-config-data\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.020797 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-scripts\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.037140 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkzpl\" (UniqueName: \"kubernetes.io/projected/13a57460-238e-4006-aacd-4d7f1f0322a0-kube-api-access-dkzpl\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.090128 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-scripts\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.090614 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.090679 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192bd785-3d80-4b5e-8db2-55a0e3846802-run-httpd\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.090764 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-config-data\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.090903 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192bd785-3d80-4b5e-8db2-55a0e3846802-log-httpd\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.091016 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25flm\" (UniqueName: \"kubernetes.io/projected/192bd785-3d80-4b5e-8db2-55a0e3846802-kube-api-access-25flm\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.091109 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.127547 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9f9zw" event={"ID":"2d955d7c-086f-484a-872c-43764c37a3b0","Type":"ContainerStarted","Data":"9e1d52e0491a33c865991e329484ea451f934c9c062454ffa585b9f9c36c7d80"} Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.130605 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" event={"ID":"8b99ad93-74e8-47fc-b11b-df18ab6edf87","Type":"ContainerStarted","Data":"dd2f00ec1079bb58ce0f506d515e3fa3d1f3e6beb1df3160e87e26cf3c8adc62"} Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.136534 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.138575 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.141078 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.141266 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.152863 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.156496 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.178227 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-825nv" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.197720 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25flm\" (UniqueName: \"kubernetes.io/projected/192bd785-3d80-4b5e-8db2-55a0e3846802-kube-api-access-25flm\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.197852 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.198218 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-scripts\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.198272 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.198290 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192bd785-3d80-4b5e-8db2-55a0e3846802-run-httpd\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.198379 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-config-data\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.198541 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192bd785-3d80-4b5e-8db2-55a0e3846802-log-httpd\") pod \"ceilometer-0\" (UID: 
\"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.201368 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192bd785-3d80-4b5e-8db2-55a0e3846802-run-httpd\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.208185 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192bd785-3d80-4b5e-8db2-55a0e3846802-log-httpd\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.219308 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.226886 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.227590 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-scripts\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.229234 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-config-data\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.237723 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25flm\" (UniqueName: \"kubernetes.io/projected/192bd785-3d80-4b5e-8db2-55a0e3846802-kube-api-access-25flm\") pod \"ceilometer-0\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.256825 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.285440 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54d4dd665-7g5hl"] Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.299838 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.300064 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a447b254-4b5c-4f2f-924e-ea443f001d7c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.300110 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.300152 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6r6j\" (UniqueName: \"kubernetes.io/projected/a447b254-4b5c-4f2f-924e-ea443f001d7c-kube-api-access-l6r6j\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.300171 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a447b254-4b5c-4f2f-924e-ea443f001d7c-logs\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.300250 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.300305 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.300434 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.310671 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-8556bc9b75-ldp9m"] Dec 01 14:14:53 crc kubenswrapper[4585]: W1201 14:14:53.329119 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a389fb6_e678_4552_8cf8_3aea857a545c.slice/crio-fa34beb20f110db1a486dd8ccaf3f004c4dcbbd684e658a0d5f61f4d952c9817 WatchSource:0}: Error finding container fa34beb20f110db1a486dd8ccaf3f004c4dcbbd684e658a0d5f61f4d952c9817: Status 404 returned error can't find the container with id fa34beb20f110db1a486dd8ccaf3f004c4dcbbd684e658a0d5f61f4d952c9817 Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.402237 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.402299 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.402368 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a447b254-4b5c-4f2f-924e-ea443f001d7c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.402405 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.402435 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6r6j\" (UniqueName: \"kubernetes.io/projected/a447b254-4b5c-4f2f-924e-ea443f001d7c-kube-api-access-l6r6j\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.402456 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a447b254-4b5c-4f2f-924e-ea443f001d7c-logs\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.402481 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.402505 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.402840 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a447b254-4b5c-4f2f-924e-ea443f001d7c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.402866 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.403487 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a447b254-4b5c-4f2f-924e-ea443f001d7c-logs\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.408699 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.415497 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.438947 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.441718 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.442734 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6r6j\" (UniqueName: \"kubernetes.io/projected/a447b254-4b5c-4f2f-924e-ea443f001d7c-kube-api-access-l6r6j\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.444127 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.481470 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.512618 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kg2tb"] Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.515110 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:14:53 crc kubenswrapper[4585]: W1201 14:14:53.530092 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc9ff7ae_64b5_41db_94e0_5b30cb0a923c.slice/crio-8499f1a1f4781ad5db132b68bf5309f8e152571bc043a52b3630d7a6e863e9f7 WatchSource:0}: Error finding container 8499f1a1f4781ad5db132b68bf5309f8e152571bc043a52b3630d7a6e863e9f7: Status 404 returned error can't find the container with id 8499f1a1f4781ad5db132b68bf5309f8e152571bc043a52b3630d7a6e863e9f7 Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.670541 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wmh4t"] Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.871833 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-sf4wz"] Dec 01 14:14:53 crc kubenswrapper[4585]: I1201 14:14:53.920667 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dpwxn"] Dec 01 14:14:53 crc kubenswrapper[4585]: W1201 14:14:53.945683 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7490835_3211_4849_83cb_ec2e642df346.slice/crio-3dcef259f9299592021520568ccb57465291599ea7ad0048a4cff820c569c8be WatchSource:0}: Error finding container 3dcef259f9299592021520568ccb57465291599ea7ad0048a4cff820c569c8be: Status 404 returned error can't find the container with id 3dcef259f9299592021520568ccb57465291599ea7ad0048a4cff820c569c8be Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.159894 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9f9zw" event={"ID":"2d955d7c-086f-484a-872c-43764c37a3b0","Type":"ContainerStarted","Data":"994dfe87ff965b12b9062dd5f637b19bf84c6b24024cdd25b602dee97943a437"} Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.162745 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8556bc9b75-ldp9m" event={"ID":"5a389fb6-e678-4552-8cf8-3aea857a545c","Type":"ContainerStarted","Data":"fa34beb20f110db1a486dd8ccaf3f004c4dcbbd684e658a0d5f61f4d952c9817"} Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.164719 4585 generic.go:334] "Generic (PLEG): container finished" podID="8b99ad93-74e8-47fc-b11b-df18ab6edf87" containerID="8f185ba3446ba56fd909af56bdfae39bad6e1094c7d43065894b6a3c39e89854" exitCode=0 Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.164779 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" event={"ID":"8b99ad93-74e8-47fc-b11b-df18ab6edf87","Type":"ContainerDied","Data":"8f185ba3446ba56fd909af56bdfae39bad6e1094c7d43065894b6a3c39e89854"} Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.194510 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d4dd665-7g5hl" 
event={"ID":"477277de-eaf5-4536-90d9-9091737cb66c","Type":"ContainerStarted","Data":"4daf9f1042b58fdd080dfd024c4ab2faa26471369bf810a35309b7c12aa1db0c"} Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.204462 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sf4wz" event={"ID":"7557d295-9d3a-4d0f-933a-77390e7e179e","Type":"ContainerStarted","Data":"fb7b03029bd42f63483b4103dd22a27c4cb66a9899d0b086dd99c0310b175f90"} Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.206265 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kg2tb" event={"ID":"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c","Type":"ContainerStarted","Data":"9c70defbbb5c1c8842a26ab188f71e6a51ea09a520e94f285f9597e1d7859d01"} Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.206294 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kg2tb" event={"ID":"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c","Type":"ContainerStarted","Data":"8499f1a1f4781ad5db132b68bf5309f8e152571bc043a52b3630d7a6e863e9f7"} Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.242436 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" event={"ID":"c7490835-3211-4849-83cb-ec2e642df346","Type":"ContainerStarted","Data":"3dcef259f9299592021520568ccb57465291599ea7ad0048a4cff820c569c8be"} Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.253533 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wmh4t" event={"ID":"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48","Type":"ContainerStarted","Data":"fba5b48e7d41d5e73af29130c40c82c177a2f45ee55d349fef091f6fc7de4d96"} Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.314205 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.314765 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9f9zw" podStartSLOduration=3.314748157 podStartE2EDuration="3.314748157s" podCreationTimestamp="2025-12-01 14:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:54.278785269 +0000 UTC m=+1008.262999124" watchObservedRunningTime="2025-12-01 14:14:54.314748157 +0000 UTC m=+1008.298962012" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.406616 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54d4dd665-7g5hl"] Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.416676 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-825nv"] Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.599181 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6fdcc48fd5-lvls9"] Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.600835 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.620396 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.641528 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fdcc48fd5-lvls9"] Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.674958 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5260d09-1e2d-4d28-8c75-a717898864e6-config-data\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.675046 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mddlf\" (UniqueName: \"kubernetes.io/projected/a5260d09-1e2d-4d28-8c75-a717898864e6-kube-api-access-mddlf\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.675080 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5260d09-1e2d-4d28-8c75-a717898864e6-scripts\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.675106 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5260d09-1e2d-4d28-8c75-a717898864e6-logs\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.676446 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5260d09-1e2d-4d28-8c75-a717898864e6-horizon-secret-key\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.714318 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-kg2tb" podStartSLOduration=2.714295508 podStartE2EDuration="2.714295508s" podCreationTimestamp="2025-12-01 14:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:54.551367145 +0000 UTC m=+1008.535581000" watchObservedRunningTime="2025-12-01 14:14:54.714295508 +0000 UTC m=+1008.698509363" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.747039 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.775062 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.781669 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5260d09-1e2d-4d28-8c75-a717898864e6-horizon-secret-key\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: 
\"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.782198 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5260d09-1e2d-4d28-8c75-a717898864e6-config-data\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.782305 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mddlf\" (UniqueName: \"kubernetes.io/projected/a5260d09-1e2d-4d28-8c75-a717898864e6-kube-api-access-mddlf\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.782379 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5260d09-1e2d-4d28-8c75-a717898864e6-scripts\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.782468 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5260d09-1e2d-4d28-8c75-a717898864e6-logs\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.782860 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5260d09-1e2d-4d28-8c75-a717898864e6-logs\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.790153 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5260d09-1e2d-4d28-8c75-a717898864e6-scripts\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.791173 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5260d09-1e2d-4d28-8c75-a717898864e6-config-data\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.811998 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5260d09-1e2d-4d28-8c75-a717898864e6-horizon-secret-key\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.865135 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mddlf\" (UniqueName: \"kubernetes.io/projected/a5260d09-1e2d-4d28-8c75-a717898864e6-kube-api-access-mddlf\") pod \"horizon-6fdcc48fd5-lvls9\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:54 crc kubenswrapper[4585]: I1201 14:14:54.918369 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Dec 01 14:14:54 crc kubenswrapper[4585]: W1201 14:14:54.982827 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13a57460_238e_4006_aacd_4d7f1f0322a0.slice/crio-7359e14150961b88862d859ec23fd32ef8db7cfdcde348d8b3286ec5962ed91c WatchSource:0}: Error finding container 7359e14150961b88862d859ec23fd32ef8db7cfdcde348d8b3286ec5962ed91c: Status 404 returned error can't find the container with id 7359e14150961b88862d859ec23fd32ef8db7cfdcde348d8b3286ec5962ed91c Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.011234 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.247360 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.297720 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192bd785-3d80-4b5e-8db2-55a0e3846802","Type":"ContainerStarted","Data":"300b84db12aa113144ca01a0ef795f55e74e70e671d8419994f67cc9a4ad413b"} Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.314052 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-ovsdbserver-nb\") pod \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.314133 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-ovsdbserver-sb\") pod \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.314188 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s785n\" (UniqueName: \"kubernetes.io/projected/8b99ad93-74e8-47fc-b11b-df18ab6edf87-kube-api-access-s785n\") pod \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.314270 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-dns-swift-storage-0\") pod \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.314321 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-config\") pod \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.314395 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-dns-svc\") pod \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\" (UID: \"8b99ad93-74e8-47fc-b11b-df18ab6edf87\") " Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.332205 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8b99ad93-74e8-47fc-b11b-df18ab6edf87-kube-api-access-s785n" (OuterVolumeSpecName: "kube-api-access-s785n") pod "8b99ad93-74e8-47fc-b11b-df18ab6edf87" (UID: "8b99ad93-74e8-47fc-b11b-df18ab6edf87"). InnerVolumeSpecName "kube-api-access-s785n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.334285 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" event={"ID":"8b99ad93-74e8-47fc-b11b-df18ab6edf87","Type":"ContainerDied","Data":"dd2f00ec1079bb58ce0f506d515e3fa3d1f3e6beb1df3160e87e26cf3c8adc62"} Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.334332 4585 scope.go:117] "RemoveContainer" containerID="8f185ba3446ba56fd909af56bdfae39bad6e1094c7d43065894b6a3c39e89854" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.334575 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jfbr6" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.358198 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8b99ad93-74e8-47fc-b11b-df18ab6edf87" (UID: "8b99ad93-74e8-47fc-b11b-df18ab6edf87"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.358395 4585 generic.go:334] "Generic (PLEG): container finished" podID="c7490835-3211-4849-83cb-ec2e642df346" containerID="fb2fb01621a9a6293cd56586b1677b36fd943fabeab2837f5ab0c35aeb045d87" exitCode=0 Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.358469 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" event={"ID":"c7490835-3211-4849-83cb-ec2e642df346","Type":"ContainerDied","Data":"fb2fb01621a9a6293cd56586b1677b36fd943fabeab2837f5ab0c35aeb045d87"} Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.370475 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13a57460-238e-4006-aacd-4d7f1f0322a0","Type":"ContainerStarted","Data":"7359e14150961b88862d859ec23fd32ef8db7cfdcde348d8b3286ec5962ed91c"} Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.388909 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-825nv" event={"ID":"30f08b08-85f9-4df9-97ce-f0f25238e889","Type":"ContainerStarted","Data":"a0153b7910cad9568eae2fbc14e3466eba38e67f6169a8711dec446857d34d48"} Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.391097 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-config" (OuterVolumeSpecName: "config") pod "8b99ad93-74e8-47fc-b11b-df18ab6edf87" (UID: "8b99ad93-74e8-47fc-b11b-df18ab6edf87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.391798 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b99ad93-74e8-47fc-b11b-df18ab6edf87" (UID: "8b99ad93-74e8-47fc-b11b-df18ab6edf87"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.400814 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8b99ad93-74e8-47fc-b11b-df18ab6edf87" (UID: "8b99ad93-74e8-47fc-b11b-df18ab6edf87"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.420469 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.420493 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.420504 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.420513 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s785n\" (UniqueName: \"kubernetes.io/projected/8b99ad93-74e8-47fc-b11b-df18ab6edf87-kube-api-access-s785n\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.420522 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.474714 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8b99ad93-74e8-47fc-b11b-df18ab6edf87" (UID: "8b99ad93-74e8-47fc-b11b-df18ab6edf87"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.522549 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b99ad93-74e8-47fc-b11b-df18ab6edf87-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.750035 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jfbr6"] Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.760307 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jfbr6"] Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.806404 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:14:55 crc kubenswrapper[4585]: I1201 14:14:55.917644 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fdcc48fd5-lvls9"] Dec 01 14:14:56 crc kubenswrapper[4585]: I1201 14:14:56.465089 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b99ad93-74e8-47fc-b11b-df18ab6edf87" path="/var/lib/kubelet/pods/8b99ad93-74e8-47fc-b11b-df18ab6edf87/volumes" Dec 01 14:14:56 crc kubenswrapper[4585]: I1201 14:14:56.466498 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:14:56 crc kubenswrapper[4585]: I1201 14:14:56.466611 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" event={"ID":"c7490835-3211-4849-83cb-ec2e642df346","Type":"ContainerStarted","Data":"d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96"} Dec 01 14:14:56 crc kubenswrapper[4585]: I1201 14:14:56.466710 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fdcc48fd5-lvls9" event={"ID":"a5260d09-1e2d-4d28-8c75-a717898864e6","Type":"ContainerStarted","Data":"233d674c97c65e0a30d9aa063a3fca04303e435e86762bbc3afd83a4bad05bb0"} Dec 01 14:14:56 crc kubenswrapper[4585]: I1201 14:14:56.466793 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a447b254-4b5c-4f2f-924e-ea443f001d7c","Type":"ContainerStarted","Data":"e627c42afbdcc7064835aaedc31b6096dfef6ba32900c0b2d3262d969be1b59c"} Dec 01 14:14:56 crc kubenswrapper[4585]: I1201 14:14:56.704022 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" podStartSLOduration=4.704003585 podStartE2EDuration="4.704003585s" podCreationTimestamp="2025-12-01 14:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:56.688453271 +0000 UTC m=+1010.672667126" watchObservedRunningTime="2025-12-01 14:14:56.704003585 +0000 UTC m=+1010.688217430" Dec 01 14:14:57 crc kubenswrapper[4585]: I1201 14:14:57.473222 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a447b254-4b5c-4f2f-924e-ea443f001d7c","Type":"ContainerStarted","Data":"8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566"} Dec 01 14:14:57 crc kubenswrapper[4585]: I1201 14:14:57.477239 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13a57460-238e-4006-aacd-4d7f1f0322a0","Type":"ContainerStarted","Data":"b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9"} Dec 01 
14:14:58 crc kubenswrapper[4585]: I1201 14:14:58.493584 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13a57460-238e-4006-aacd-4d7f1f0322a0","Type":"ContainerStarted","Data":"04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b"} Dec 01 14:14:58 crc kubenswrapper[4585]: I1201 14:14:58.495663 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="13a57460-238e-4006-aacd-4d7f1f0322a0" containerName="glance-httpd" containerID="cri-o://04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b" gracePeriod=30 Dec 01 14:14:58 crc kubenswrapper[4585]: I1201 14:14:58.495812 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="13a57460-238e-4006-aacd-4d7f1f0322a0" containerName="glance-log" containerID="cri-o://b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9" gracePeriod=30 Dec 01 14:14:58 crc kubenswrapper[4585]: I1201 14:14:58.499635 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a447b254-4b5c-4f2f-924e-ea443f001d7c" containerName="glance-log" containerID="cri-o://8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566" gracePeriod=30 Dec 01 14:14:58 crc kubenswrapper[4585]: I1201 14:14:58.499640 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a447b254-4b5c-4f2f-924e-ea443f001d7c","Type":"ContainerStarted","Data":"a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0"} Dec 01 14:14:58 crc kubenswrapper[4585]: I1201 14:14:58.499740 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a447b254-4b5c-4f2f-924e-ea443f001d7c" containerName="glance-httpd" containerID="cri-o://a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0" gracePeriod=30 Dec 01 14:14:58 crc kubenswrapper[4585]: I1201 14:14:58.514847 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.514831495 podStartE2EDuration="6.514831495s" podCreationTimestamp="2025-12-01 14:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:58.511222889 +0000 UTC m=+1012.495436744" watchObservedRunningTime="2025-12-01 14:14:58.514831495 +0000 UTC m=+1012.499045350" Dec 01 14:14:58 crc kubenswrapper[4585]: I1201 14:14:58.547540 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.547517546 podStartE2EDuration="6.547517546s" podCreationTimestamp="2025-12-01 14:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:14:58.544440704 +0000 UTC m=+1012.528654559" watchObservedRunningTime="2025-12-01 14:14:58.547517546 +0000 UTC m=+1012.531731401" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.367402 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.370101 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.458110 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6r6j\" (UniqueName: \"kubernetes.io/projected/a447b254-4b5c-4f2f-924e-ea443f001d7c-kube-api-access-l6r6j\") pod \"a447b254-4b5c-4f2f-924e-ea443f001d7c\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.458162 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-internal-tls-certs\") pod \"a447b254-4b5c-4f2f-924e-ea443f001d7c\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.458203 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a447b254-4b5c-4f2f-924e-ea443f001d7c-httpd-run\") pod \"a447b254-4b5c-4f2f-924e-ea443f001d7c\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.458225 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"a447b254-4b5c-4f2f-924e-ea443f001d7c\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.458828 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a447b254-4b5c-4f2f-924e-ea443f001d7c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a447b254-4b5c-4f2f-924e-ea443f001d7c" (UID: "a447b254-4b5c-4f2f-924e-ea443f001d7c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.458895 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13a57460-238e-4006-aacd-4d7f1f0322a0-httpd-run\") pod \"13a57460-238e-4006-aacd-4d7f1f0322a0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.459172 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13a57460-238e-4006-aacd-4d7f1f0322a0-logs\") pod \"13a57460-238e-4006-aacd-4d7f1f0322a0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.459222 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-scripts\") pod \"a447b254-4b5c-4f2f-924e-ea443f001d7c\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.459301 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"13a57460-238e-4006-aacd-4d7f1f0322a0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.459324 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-scripts\") pod \"13a57460-238e-4006-aacd-4d7f1f0322a0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.459364 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-config-data\") pod \"a447b254-4b5c-4f2f-924e-ea443f001d7c\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.459385 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-config-data\") pod \"13a57460-238e-4006-aacd-4d7f1f0322a0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.459409 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a447b254-4b5c-4f2f-924e-ea443f001d7c-logs\") pod \"a447b254-4b5c-4f2f-924e-ea443f001d7c\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.459437 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-public-tls-certs\") pod \"13a57460-238e-4006-aacd-4d7f1f0322a0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.459486 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkzpl\" (UniqueName: \"kubernetes.io/projected/13a57460-238e-4006-aacd-4d7f1f0322a0-kube-api-access-dkzpl\") pod \"13a57460-238e-4006-aacd-4d7f1f0322a0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.459565 4585 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-combined-ca-bundle\") pod \"13a57460-238e-4006-aacd-4d7f1f0322a0\" (UID: \"13a57460-238e-4006-aacd-4d7f1f0322a0\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.459589 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-combined-ca-bundle\") pod \"a447b254-4b5c-4f2f-924e-ea443f001d7c\" (UID: \"a447b254-4b5c-4f2f-924e-ea443f001d7c\") " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.461203 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a447b254-4b5c-4f2f-924e-ea443f001d7c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.463989 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a57460-238e-4006-aacd-4d7f1f0322a0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "13a57460-238e-4006-aacd-4d7f1f0322a0" (UID: "13a57460-238e-4006-aacd-4d7f1f0322a0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.465726 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a57460-238e-4006-aacd-4d7f1f0322a0-logs" (OuterVolumeSpecName: "logs") pod "13a57460-238e-4006-aacd-4d7f1f0322a0" (UID: "13a57460-238e-4006-aacd-4d7f1f0322a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.467695 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a447b254-4b5c-4f2f-924e-ea443f001d7c-kube-api-access-l6r6j" (OuterVolumeSpecName: "kube-api-access-l6r6j") pod "a447b254-4b5c-4f2f-924e-ea443f001d7c" (UID: "a447b254-4b5c-4f2f-924e-ea443f001d7c"). InnerVolumeSpecName "kube-api-access-l6r6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.467703 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "a447b254-4b5c-4f2f-924e-ea443f001d7c" (UID: "a447b254-4b5c-4f2f-924e-ea443f001d7c"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.467705 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-scripts" (OuterVolumeSpecName: "scripts") pod "a447b254-4b5c-4f2f-924e-ea443f001d7c" (UID: "a447b254-4b5c-4f2f-924e-ea443f001d7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.469367 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a447b254-4b5c-4f2f-924e-ea443f001d7c-logs" (OuterVolumeSpecName: "logs") pod "a447b254-4b5c-4f2f-924e-ea443f001d7c" (UID: "a447b254-4b5c-4f2f-924e-ea443f001d7c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.472112 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "13a57460-238e-4006-aacd-4d7f1f0322a0" (UID: "13a57460-238e-4006-aacd-4d7f1f0322a0"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.475570 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-scripts" (OuterVolumeSpecName: "scripts") pod "13a57460-238e-4006-aacd-4d7f1f0322a0" (UID: "13a57460-238e-4006-aacd-4d7f1f0322a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.477860 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a57460-238e-4006-aacd-4d7f1f0322a0-kube-api-access-dkzpl" (OuterVolumeSpecName: "kube-api-access-dkzpl") pod "13a57460-238e-4006-aacd-4d7f1f0322a0" (UID: "13a57460-238e-4006-aacd-4d7f1f0322a0"). InnerVolumeSpecName "kube-api-access-dkzpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.495156 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13a57460-238e-4006-aacd-4d7f1f0322a0" (UID: "13a57460-238e-4006-aacd-4d7f1f0322a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.514240 4585 generic.go:334] "Generic (PLEG): container finished" podID="2d955d7c-086f-484a-872c-43764c37a3b0" containerID="994dfe87ff965b12b9062dd5f637b19bf84c6b24024cdd25b602dee97943a437" exitCode=0 Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.514331 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9f9zw" event={"ID":"2d955d7c-086f-484a-872c-43764c37a3b0","Type":"ContainerDied","Data":"994dfe87ff965b12b9062dd5f637b19bf84c6b24024cdd25b602dee97943a437"} Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.515617 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-config-data" (OuterVolumeSpecName: "config-data") pod "13a57460-238e-4006-aacd-4d7f1f0322a0" (UID: "13a57460-238e-4006-aacd-4d7f1f0322a0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.518600 4585 generic.go:334] "Generic (PLEG): container finished" podID="a447b254-4b5c-4f2f-924e-ea443f001d7c" containerID="a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0" exitCode=143 Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.518627 4585 generic.go:334] "Generic (PLEG): container finished" podID="a447b254-4b5c-4f2f-924e-ea443f001d7c" containerID="8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566" exitCode=143 Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.518668 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a447b254-4b5c-4f2f-924e-ea443f001d7c","Type":"ContainerDied","Data":"a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0"} Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.518691 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a447b254-4b5c-4f2f-924e-ea443f001d7c","Type":"ContainerDied","Data":"8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566"} Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.518702 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a447b254-4b5c-4f2f-924e-ea443f001d7c","Type":"ContainerDied","Data":"e627c42afbdcc7064835aaedc31b6096dfef6ba32900c0b2d3262d969be1b59c"} Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.518717 4585 scope.go:117] "RemoveContainer" containerID="a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.518820 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.535388 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "13a57460-238e-4006-aacd-4d7f1f0322a0" (UID: "13a57460-238e-4006-aacd-4d7f1f0322a0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.537233 4585 generic.go:334] "Generic (PLEG): container finished" podID="13a57460-238e-4006-aacd-4d7f1f0322a0" containerID="04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b" exitCode=0 Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.537260 4585 generic.go:334] "Generic (PLEG): container finished" podID="13a57460-238e-4006-aacd-4d7f1f0322a0" containerID="b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9" exitCode=143 Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.537296 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13a57460-238e-4006-aacd-4d7f1f0322a0","Type":"ContainerDied","Data":"04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b"} Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.537335 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13a57460-238e-4006-aacd-4d7f1f0322a0","Type":"ContainerDied","Data":"b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9"} Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.537351 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13a57460-238e-4006-aacd-4d7f1f0322a0","Type":"ContainerDied","Data":"7359e14150961b88862d859ec23fd32ef8db7cfdcde348d8b3286ec5962ed91c"} Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.537417 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.544108 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-config-data" (OuterVolumeSpecName: "config-data") pod "a447b254-4b5c-4f2f-924e-ea443f001d7c" (UID: "a447b254-4b5c-4f2f-924e-ea443f001d7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.544218 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a447b254-4b5c-4f2f-924e-ea443f001d7c" (UID: "a447b254-4b5c-4f2f-924e-ea443f001d7c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.551968 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a447b254-4b5c-4f2f-924e-ea443f001d7c" (UID: "a447b254-4b5c-4f2f-924e-ea443f001d7c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563627 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563666 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563681 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6r6j\" (UniqueName: \"kubernetes.io/projected/a447b254-4b5c-4f2f-924e-ea443f001d7c-kube-api-access-l6r6j\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563693 4585 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563727 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563742 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13a57460-238e-4006-aacd-4d7f1f0322a0-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563755 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13a57460-238e-4006-aacd-4d7f1f0322a0-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563769 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563792 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563805 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563815 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a447b254-4b5c-4f2f-924e-ea443f001d7c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563826 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563836 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a447b254-4b5c-4f2f-924e-ea443f001d7c-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563846 4585 reconciler_common.go:293] "Volume detached for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13a57460-238e-4006-aacd-4d7f1f0322a0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.563856 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkzpl\" (UniqueName: \"kubernetes.io/projected/13a57460-238e-4006-aacd-4d7f1f0322a0-kube-api-access-dkzpl\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.585558 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.593251 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.594507 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.600887 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.624663 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:14:59 crc kubenswrapper[4585]: E1201 14:14:59.625293 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b99ad93-74e8-47fc-b11b-df18ab6edf87" containerName="init" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.625311 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b99ad93-74e8-47fc-b11b-df18ab6edf87" containerName="init" Dec 01 14:14:59 crc kubenswrapper[4585]: E1201 14:14:59.625327 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a57460-238e-4006-aacd-4d7f1f0322a0" containerName="glance-httpd" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.625334 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a57460-238e-4006-aacd-4d7f1f0322a0" containerName="glance-httpd" Dec 01 14:14:59 crc kubenswrapper[4585]: E1201 14:14:59.625342 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a447b254-4b5c-4f2f-924e-ea443f001d7c" containerName="glance-log" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.625351 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a447b254-4b5c-4f2f-924e-ea443f001d7c" containerName="glance-log" Dec 01 14:14:59 crc kubenswrapper[4585]: E1201 14:14:59.625382 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a447b254-4b5c-4f2f-924e-ea443f001d7c" containerName="glance-httpd" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.625389 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a447b254-4b5c-4f2f-924e-ea443f001d7c" containerName="glance-httpd" Dec 01 14:14:59 crc kubenswrapper[4585]: E1201 14:14:59.625411 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a57460-238e-4006-aacd-4d7f1f0322a0" containerName="glance-log" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.625417 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a57460-238e-4006-aacd-4d7f1f0322a0" containerName="glance-log" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.625607 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a57460-238e-4006-aacd-4d7f1f0322a0" containerName="glance-log" Dec 01 14:14:59 crc 
kubenswrapper[4585]: I1201 14:14:59.625624 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b99ad93-74e8-47fc-b11b-df18ab6edf87" containerName="init" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.625635 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a57460-238e-4006-aacd-4d7f1f0322a0" containerName="glance-httpd" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.625654 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a447b254-4b5c-4f2f-924e-ea443f001d7c" containerName="glance-log" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.625666 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a447b254-4b5c-4f2f-924e-ea443f001d7c" containerName="glance-httpd" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.626760 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.640651 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.663333 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.663610 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.665350 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.665388 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.674744 4585 scope.go:117] "RemoveContainer" containerID="8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.768279 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-scripts\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.768336 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-config-data\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.768442 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01f410ce-dfb4-4188-9a09-d3983f5ef047-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.768477 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.768503 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f410ce-dfb4-4188-9a09-d3983f5ef047-logs\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.768529 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.768595 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.768610 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn4tm\" (UniqueName: \"kubernetes.io/projected/01f410ce-dfb4-4188-9a09-d3983f5ef047-kube-api-access-kn4tm\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.791697 4585 scope.go:117] "RemoveContainer" containerID="a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0" Dec 01 14:14:59 crc kubenswrapper[4585]: E1201 14:14:59.792360 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0\": container with ID starting with a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0 not found: ID does not exist" containerID="a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.792395 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0"} err="failed to get container status \"a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0\": rpc error: code = NotFound desc = could not find container \"a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0\": container with ID starting with a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0 not found: ID does not exist" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.792442 4585 scope.go:117] "RemoveContainer" containerID="8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566" Dec 01 14:14:59 crc kubenswrapper[4585]: E1201 14:14:59.792919 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566\": container with ID starting with 
8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566 not found: ID does not exist" containerID="8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.792937 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566"} err="failed to get container status \"8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566\": rpc error: code = NotFound desc = could not find container \"8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566\": container with ID starting with 8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566 not found: ID does not exist" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.792953 4585 scope.go:117] "RemoveContainer" containerID="a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.793531 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0"} err="failed to get container status \"a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0\": rpc error: code = NotFound desc = could not find container \"a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0\": container with ID starting with a2a7161b2b2704763f781ebc42ab3536c720ba6fe1bc349e50d34b12f8d90ee0 not found: ID does not exist" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.793551 4585 scope.go:117] "RemoveContainer" containerID="8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.793924 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566"} err="failed to get container status \"8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566\": rpc error: code = NotFound desc = could not find container \"8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566\": container with ID starting with 8114f5d3a4cdb1d058f5985a323b7f1ef194487637963d47f82e57a07a199566 not found: ID does not exist" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.793962 4585 scope.go:117] "RemoveContainer" containerID="04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.870382 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-scripts\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.870439 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-config-data\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.870528 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01f410ce-dfb4-4188-9a09-d3983f5ef047-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.870557 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.870583 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f410ce-dfb4-4188-9a09-d3983f5ef047-logs\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.870625 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.870649 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.870664 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn4tm\" (UniqueName: \"kubernetes.io/projected/01f410ce-dfb4-4188-9a09-d3983f5ef047-kube-api-access-kn4tm\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.876220 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.877722 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f410ce-dfb4-4188-9a09-d3983f5ef047-logs\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.880909 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01f410ce-dfb4-4188-9a09-d3983f5ef047-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.885675 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-scripts\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc 
kubenswrapper[4585]: I1201 14:14:59.890759 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.893786 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.902025 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn4tm\" (UniqueName: \"kubernetes.io/projected/01f410ce-dfb4-4188-9a09-d3983f5ef047-kube-api-access-kn4tm\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.903568 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-config-data\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:14:59 crc kubenswrapper[4585]: I1201 14:14:59.941362 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.013952 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.069636 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.097292 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.121297 4585 scope.go:117] "RemoveContainer" containerID="b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.132051 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.133466 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.138621 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.138842 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.148726 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.202432 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8556bc9b75-ldp9m"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.250956 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f5b64975d-2mfhq"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.264885 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.281801 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.284514 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-scripts\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.284635 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdrfs\" (UniqueName: \"kubernetes.io/projected/defa24de-de6e-4484-b17a-ff8a1cd31465-kube-api-access-xdrfs\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.286999 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f5b64975d-2mfhq"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.289246 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.289305 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.289349 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/defa24de-de6e-4484-b17a-ff8a1cd31465-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.289383 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.289414 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-config-data\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.289522 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/defa24de-de6e-4484-b17a-ff8a1cd31465-logs\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.298615 4585 scope.go:117] "RemoveContainer" containerID="04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b" Dec 01 14:15:00 crc kubenswrapper[4585]: E1201 14:15:00.304543 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b\": container with ID starting with 04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b not found: ID does not exist" containerID="04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.304602 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b"} err="failed to get container status \"04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b\": rpc error: code = NotFound desc = could not find container \"04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b\": container with ID starting with 04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b not found: ID does not exist" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.304648 4585 scope.go:117] "RemoveContainer" containerID="b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9" Dec 01 14:15:00 crc kubenswrapper[4585]: E1201 14:15:00.306724 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9\": container with ID starting with b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9 not found: ID does not exist" containerID="b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.306746 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9"} err="failed to get container status \"b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9\": rpc error: code = NotFound desc = could not find container \"b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9\": container with ID starting with 
b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9 not found: ID does not exist" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.306837 4585 scope.go:117] "RemoveContainer" containerID="04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.312869 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b"} err="failed to get container status \"04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b\": rpc error: code = NotFound desc = could not find container \"04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b\": container with ID starting with 04afeb6c6c973ca176b2fc78b62075bdb168640f6065a63c4ae9b576cb30b12b not found: ID does not exist" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.312900 4585 scope.go:117] "RemoveContainer" containerID="b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.313606 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9"} err="failed to get container status \"b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9\": rpc error: code = NotFound desc = could not find container \"b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9\": container with ID starting with b949f5c5a7be348e38c26a51ebead64616f02c114415799e011e36c8849948d9 not found: ID does not exist" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.316524 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.349086 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.350288 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.352280 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.356382 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.357170 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.373046 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fdcc48fd5-lvls9"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.399136 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.399470 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.399634 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.399775 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/defa24de-de6e-4484-b17a-ff8a1cd31465-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.399810 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.399858 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-config-data\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.399921 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-config-data\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " 
pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.399994 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-combined-ca-bundle\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.400021 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/defa24de-de6e-4484-b17a-ff8a1cd31465-logs\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.400091 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-logs\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.400116 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-scripts\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.400177 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmxvr\" (UniqueName: \"kubernetes.io/projected/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-kube-api-access-gmxvr\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.400299 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdrfs\" (UniqueName: \"kubernetes.io/projected/defa24de-de6e-4484-b17a-ff8a1cd31465-kube-api-access-xdrfs\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.400352 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-horizon-secret-key\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.400385 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-horizon-tls-certs\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.400424 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-scripts\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " 
pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.402848 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/defa24de-de6e-4484-b17a-ff8a1cd31465-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.414423 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/defa24de-de6e-4484-b17a-ff8a1cd31465-logs\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.423289 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-scripts\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.431144 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a57460-238e-4006-aacd-4d7f1f0322a0" path="/var/lib/kubelet/pods/13a57460-238e-4006-aacd-4d7f1f0322a0/volumes" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.431763 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a447b254-4b5c-4f2f-924e-ea443f001d7c" path="/var/lib/kubelet/pods/a447b254-4b5c-4f2f-924e-ea443f001d7c/volumes" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.432302 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6bbf659b46-55tth"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.435006 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.452062 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.458785 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.459183 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-config-data\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.463484 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:15:00 crc kubenswrapper[4585]: E1201 14:15:00.464270 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance kube-api-access-xdrfs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="defa24de-de6e-4484-b17a-ff8a1cd31465" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.476311 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdrfs\" (UniqueName: \"kubernetes.io/projected/defa24de-de6e-4484-b17a-ff8a1cd31465-kube-api-access-xdrfs\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.491365 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bbf659b46-55tth"] Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.502259 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-horizon-secret-key\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.502319 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-horizon-tls-certs\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.502347 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-secret-volume\") pod \"collect-profiles-29409975-6pxl5\" (UID: \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.502388 4585 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-scripts\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.502477 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-config-volume\") pod \"collect-profiles-29409975-6pxl5\" (UID: \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.502507 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-config-data\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.502561 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-combined-ca-bundle\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.502600 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-logs\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.502660 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmxvr\" (UniqueName: \"kubernetes.io/projected/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-kube-api-access-gmxvr\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.502731 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72s6d\" (UniqueName: \"kubernetes.io/projected/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-kube-api-access-72s6d\") pod \"collect-profiles-29409975-6pxl5\" (UID: \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.504823 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-logs\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.505007 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-scripts\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.506133 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-config-data\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.506451 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.508374 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-horizon-tls-certs\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.509523 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-horizon-secret-key\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.518212 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-combined-ca-bundle\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.569019 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmxvr\" (UniqueName: \"kubernetes.io/projected/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-kube-api-access-gmxvr\") pod \"horizon-6f5b64975d-2mfhq\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.590369 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.604019 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-config-data\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.604088 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-horizon-tls-certs\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.604368 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72s6d\" (UniqueName: \"kubernetes.io/projected/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-kube-api-access-72s6d\") pod \"collect-profiles-29409975-6pxl5\" (UID: \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.604454 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-combined-ca-bundle\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.604508 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-secret-volume\") pod \"collect-profiles-29409975-6pxl5\" (UID: \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.604579 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-horizon-secret-key\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.604673 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-scripts\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.604736 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-config-volume\") pod \"collect-profiles-29409975-6pxl5\" (UID: \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.604755 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-logs\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.604770 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmbb9\" (UniqueName: \"kubernetes.io/projected/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-kube-api-access-qmbb9\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.605729 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-config-volume\") pod \"collect-profiles-29409975-6pxl5\" (UID: \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.611696 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-secret-volume\") pod \"collect-profiles-29409975-6pxl5\" (UID: \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.615504 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.631621 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.635555 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72s6d\" (UniqueName: \"kubernetes.io/projected/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-kube-api-access-72s6d\") pod \"collect-profiles-29409975-6pxl5\" (UID: \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.690470 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.705741 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-scripts\") pod \"defa24de-de6e-4484-b17a-ff8a1cd31465\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.705804 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/defa24de-de6e-4484-b17a-ff8a1cd31465-logs\") pod \"defa24de-de6e-4484-b17a-ff8a1cd31465\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.705845 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-config-data\") pod \"defa24de-de6e-4484-b17a-ff8a1cd31465\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.705873 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/defa24de-de6e-4484-b17a-ff8a1cd31465-httpd-run\") pod \"defa24de-de6e-4484-b17a-ff8a1cd31465\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.705900 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-combined-ca-bundle\") pod \"defa24de-de6e-4484-b17a-ff8a1cd31465\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.705940 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"defa24de-de6e-4484-b17a-ff8a1cd31465\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.706055 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdrfs\" (UniqueName: \"kubernetes.io/projected/defa24de-de6e-4484-b17a-ff8a1cd31465-kube-api-access-xdrfs\") pod \"defa24de-de6e-4484-b17a-ff8a1cd31465\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.706093 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-internal-tls-certs\") pod \"defa24de-de6e-4484-b17a-ff8a1cd31465\" (UID: \"defa24de-de6e-4484-b17a-ff8a1cd31465\") " Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.706310 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-config-data\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.706359 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-horizon-tls-certs\") pod \"horizon-6bbf659b46-55tth\" (UID: 
\"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.706414 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-combined-ca-bundle\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.706448 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-horizon-secret-key\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.706484 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-scripts\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.706510 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-logs\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.706524 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmbb9\" (UniqueName: \"kubernetes.io/projected/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-kube-api-access-qmbb9\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.706892 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/defa24de-de6e-4484-b17a-ff8a1cd31465-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "defa24de-de6e-4484-b17a-ff8a1cd31465" (UID: "defa24de-de6e-4484-b17a-ff8a1cd31465"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.707982 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/defa24de-de6e-4484-b17a-ff8a1cd31465-logs" (OuterVolumeSpecName: "logs") pod "defa24de-de6e-4484-b17a-ff8a1cd31465" (UID: "defa24de-de6e-4484-b17a-ff8a1cd31465"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.707999 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-config-data\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.715877 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-config-data" (OuterVolumeSpecName: "config-data") pod "defa24de-de6e-4484-b17a-ff8a1cd31465" (UID: "defa24de-de6e-4484-b17a-ff8a1cd31465"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.715984 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-scripts" (OuterVolumeSpecName: "scripts") pod "defa24de-de6e-4484-b17a-ff8a1cd31465" (UID: "defa24de-de6e-4484-b17a-ff8a1cd31465"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.716492 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-horizon-secret-key\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.716717 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-logs\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.718401 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-scripts\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.718746 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "defa24de-de6e-4484-b17a-ff8a1cd31465" (UID: "defa24de-de6e-4484-b17a-ff8a1cd31465"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.718823 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "defa24de-de6e-4484-b17a-ff8a1cd31465" (UID: "defa24de-de6e-4484-b17a-ff8a1cd31465"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.721102 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-combined-ca-bundle\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.730003 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "defa24de-de6e-4484-b17a-ff8a1cd31465" (UID: "defa24de-de6e-4484-b17a-ff8a1cd31465"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.730529 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-horizon-tls-certs\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.734754 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/defa24de-de6e-4484-b17a-ff8a1cd31465-kube-api-access-xdrfs" (OuterVolumeSpecName: "kube-api-access-xdrfs") pod "defa24de-de6e-4484-b17a-ff8a1cd31465" (UID: "defa24de-de6e-4484-b17a-ff8a1cd31465"). InnerVolumeSpecName "kube-api-access-xdrfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.741423 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmbb9\" (UniqueName: \"kubernetes.io/projected/e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1-kube-api-access-qmbb9\") pod \"horizon-6bbf659b46-55tth\" (UID: \"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1\") " pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.789914 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.808191 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdrfs\" (UniqueName: \"kubernetes.io/projected/defa24de-de6e-4484-b17a-ff8a1cd31465-kube-api-access-xdrfs\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.808387 4585 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.808485 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.808552 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/defa24de-de6e-4484-b17a-ff8a1cd31465-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.808614 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.808678 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/defa24de-de6e-4484-b17a-ff8a1cd31465-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.808739 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defa24de-de6e-4484-b17a-ff8a1cd31465-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.808818 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node 
\"crc\" " Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.874565 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 01 14:15:00 crc kubenswrapper[4585]: I1201 14:15:00.913704 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.366364 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.613531 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.620172 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f5b64975d-2mfhq"] Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.708086 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.730951 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.755539 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.759029 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.762232 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.764133 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.783145 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.830758 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5"] Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.840354 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a302cf5-b263-4654-b7cc-e7122f4b11cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.840421 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.840454 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.840484 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.840512 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.840550 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.840611 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj55m\" (UniqueName: \"kubernetes.io/projected/6a302cf5-b263-4654-b7cc-e7122f4b11cb-kube-api-access-mj55m\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.840639 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a302cf5-b263-4654-b7cc-e7122f4b11cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.850705 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bbf659b46-55tth"] Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.942054 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a302cf5-b263-4654-b7cc-e7122f4b11cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.942094 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.942120 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.942144 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.942166 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.942191 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.942238 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj55m\" (UniqueName: \"kubernetes.io/projected/6a302cf5-b263-4654-b7cc-e7122f4b11cb-kube-api-access-mj55m\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.942259 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a302cf5-b263-4654-b7cc-e7122f4b11cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.944112 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a302cf5-b263-4654-b7cc-e7122f4b11cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.949747 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.951489 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.954820 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.957336 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a302cf5-b263-4654-b7cc-e7122f4b11cb-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.959840 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.970261 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.978413 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj55m\" (UniqueName: \"kubernetes.io/projected/6a302cf5-b263-4654-b7cc-e7122f4b11cb-kube-api-access-mj55m\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:01 crc kubenswrapper[4585]: I1201 14:15:01.996295 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:15:02 crc kubenswrapper[4585]: I1201 14:15:02.098104 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:02 crc kubenswrapper[4585]: I1201 14:15:02.433750 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="defa24de-de6e-4484-b17a-ff8a1cd31465" path="/var/lib/kubelet/pods/defa24de-de6e-4484-b17a-ff8a1cd31465/volumes" Dec 01 14:15:02 crc kubenswrapper[4585]: I1201 14:15:02.868773 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:15:02 crc kubenswrapper[4585]: I1201 14:15:02.955535 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-5hcms"] Dec 01 14:15:02 crc kubenswrapper[4585]: I1201 14:15:02.955750 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" podUID="36a1f1e1-f620-422c-b66d-db401c8be015" containerName="dnsmasq-dns" containerID="cri-o://3dd39e7994be5549a91ddb1139b0379927b4d57c3046dd7791b50a8ed3f81d28" gracePeriod=10 Dec 01 14:15:03 crc kubenswrapper[4585]: I1201 14:15:03.658230 4585 generic.go:334] "Generic (PLEG): container finished" podID="36a1f1e1-f620-422c-b66d-db401c8be015" containerID="3dd39e7994be5549a91ddb1139b0379927b4d57c3046dd7791b50a8ed3f81d28" exitCode=0 Dec 01 14:15:03 crc kubenswrapper[4585]: I1201 14:15:03.658275 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" event={"ID":"36a1f1e1-f620-422c-b66d-db401c8be015","Type":"ContainerDied","Data":"3dd39e7994be5549a91ddb1139b0379927b4d57c3046dd7791b50a8ed3f81d28"} Dec 01 14:15:04 crc kubenswrapper[4585]: W1201 14:15:04.307721 4585 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01f410ce_dfb4_4188_9a09_d3983f5ef047.slice/crio-741bac4f5e07646f774dacc888fd3bc488b0bab3dcb4dfbe28925a1754d0db49 WatchSource:0}: Error finding container 741bac4f5e07646f774dacc888fd3bc488b0bab3dcb4dfbe28925a1754d0db49: Status 404 returned error can't find the container with id 741bac4f5e07646f774dacc888fd3bc488b0bab3dcb4dfbe28925a1754d0db49 Dec 01 14:15:04 crc kubenswrapper[4585]: I1201 14:15:04.671419 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01f410ce-dfb4-4188-9a09-d3983f5ef047","Type":"ContainerStarted","Data":"741bac4f5e07646f774dacc888fd3bc488b0bab3dcb4dfbe28925a1754d0db49"} Dec 01 14:15:04 crc kubenswrapper[4585]: I1201 14:15:04.965170 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.024278 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msvcz\" (UniqueName: \"kubernetes.io/projected/2d955d7c-086f-484a-872c-43764c37a3b0-kube-api-access-msvcz\") pod \"2d955d7c-086f-484a-872c-43764c37a3b0\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.024370 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-credential-keys\") pod \"2d955d7c-086f-484a-872c-43764c37a3b0\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.024406 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-combined-ca-bundle\") pod \"2d955d7c-086f-484a-872c-43764c37a3b0\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.024503 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-scripts\") pod \"2d955d7c-086f-484a-872c-43764c37a3b0\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.024623 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-config-data\") pod \"2d955d7c-086f-484a-872c-43764c37a3b0\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.024661 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-fernet-keys\") pod \"2d955d7c-086f-484a-872c-43764c37a3b0\" (UID: \"2d955d7c-086f-484a-872c-43764c37a3b0\") " Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.030666 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-scripts" (OuterVolumeSpecName: "scripts") pod "2d955d7c-086f-484a-872c-43764c37a3b0" (UID: "2d955d7c-086f-484a-872c-43764c37a3b0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.030886 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2d955d7c-086f-484a-872c-43764c37a3b0" (UID: "2d955d7c-086f-484a-872c-43764c37a3b0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.032107 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2d955d7c-086f-484a-872c-43764c37a3b0" (UID: "2d955d7c-086f-484a-872c-43764c37a3b0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.041004 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d955d7c-086f-484a-872c-43764c37a3b0-kube-api-access-msvcz" (OuterVolumeSpecName: "kube-api-access-msvcz") pod "2d955d7c-086f-484a-872c-43764c37a3b0" (UID: "2d955d7c-086f-484a-872c-43764c37a3b0"). InnerVolumeSpecName "kube-api-access-msvcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.052849 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-config-data" (OuterVolumeSpecName: "config-data") pod "2d955d7c-086f-484a-872c-43764c37a3b0" (UID: "2d955d7c-086f-484a-872c-43764c37a3b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.072842 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d955d7c-086f-484a-872c-43764c37a3b0" (UID: "2d955d7c-086f-484a-872c-43764c37a3b0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.126506 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msvcz\" (UniqueName: \"kubernetes.io/projected/2d955d7c-086f-484a-872c-43764c37a3b0-kube-api-access-msvcz\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.126546 4585 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.126557 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.126566 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.126575 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.126584 4585 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d955d7c-086f-484a-872c-43764c37a3b0-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.680607 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9f9zw" event={"ID":"2d955d7c-086f-484a-872c-43764c37a3b0","Type":"ContainerDied","Data":"9e1d52e0491a33c865991e329484ea451f934c9c062454ffa585b9f9c36c7d80"} Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.680649 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e1d52e0491a33c865991e329484ea451f934c9c062454ffa585b9f9c36c7d80" Dec 01 14:15:05 crc kubenswrapper[4585]: I1201 14:15:05.680660 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9f9zw" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.041420 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9f9zw"] Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.052497 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9f9zw"] Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.146828 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xbp2z"] Dec 01 14:15:06 crc kubenswrapper[4585]: E1201 14:15:06.147574 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d955d7c-086f-484a-872c-43764c37a3b0" containerName="keystone-bootstrap" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.147592 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d955d7c-086f-484a-872c-43764c37a3b0" containerName="keystone-bootstrap" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.148296 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d955d7c-086f-484a-872c-43764c37a3b0" containerName="keystone-bootstrap" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.149480 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.154695 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.154777 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mj6w8" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.160483 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.160744 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.163051 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.166679 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xbp2z"] Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.257375 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lbwv\" (UniqueName: \"kubernetes.io/projected/1b34175d-932c-4b94-b6cf-b164891fc965-kube-api-access-9lbwv\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.257428 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-config-data\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.257460 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-combined-ca-bundle\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: 
I1201 14:15:06.257517 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-fernet-keys\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.257552 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-credential-keys\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.257580 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-scripts\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.359268 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lbwv\" (UniqueName: \"kubernetes.io/projected/1b34175d-932c-4b94-b6cf-b164891fc965-kube-api-access-9lbwv\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.359343 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-config-data\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.359386 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-combined-ca-bundle\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.359423 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-fernet-keys\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.359459 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-credential-keys\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.359489 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-scripts\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.361014 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" 
Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.361253 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.361445 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.375279 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-scripts\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.375646 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-config-data\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.376023 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-combined-ca-bundle\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.377236 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-fernet-keys\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.379629 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lbwv\" (UniqueName: \"kubernetes.io/projected/1b34175d-932c-4b94-b6cf-b164891fc965-kube-api-access-9lbwv\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.382136 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-credential-keys\") pod \"keystone-bootstrap-xbp2z\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.431261 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d955d7c-086f-484a-872c-43764c37a3b0" path="/var/lib/kubelet/pods/2d955d7c-086f-484a-872c-43764c37a3b0/volumes" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.472521 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mj6w8" Dec 01 14:15:06 crc kubenswrapper[4585]: I1201 14:15:06.483726 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:10 crc kubenswrapper[4585]: I1201 14:15:10.732507 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5b64975d-2mfhq" event={"ID":"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288","Type":"ContainerStarted","Data":"88b369e3178655e086e9a0801f919b9233e05bd6833be87fe93a8dc38fae50b6"} Dec 01 14:15:11 crc kubenswrapper[4585]: W1201 14:15:11.378088 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3b59a9b_bef9_4f7c_b861_bf6bc2a8bac1.slice/crio-9bb642d77558aaeb456fa0ab55e6e49ce9b3cedbf544610e266caf6303b3cbbc WatchSource:0}: Error finding container 9bb642d77558aaeb456fa0ab55e6e49ce9b3cedbf544610e266caf6303b3cbbc: Status 404 returned error can't find the container with id 9bb642d77558aaeb456fa0ab55e6e49ce9b3cedbf544610e266caf6303b3cbbc Dec 01 14:15:11 crc kubenswrapper[4585]: I1201 14:15:11.681174 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" podUID="36a1f1e1-f620-422c-b66d-db401c8be015" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: i/o timeout" Dec 01 14:15:11 crc kubenswrapper[4585]: I1201 14:15:11.746475 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bbf659b46-55tth" event={"ID":"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1","Type":"ContainerStarted","Data":"9bb642d77558aaeb456fa0ab55e6e49ce9b3cedbf544610e266caf6303b3cbbc"} Dec 01 14:15:13 crc kubenswrapper[4585]: E1201 14:15:13.370564 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 01 14:15:13 crc kubenswrapper[4585]: E1201 14:15:13.370832 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n565h56h77h5ddh86h66h57dh5ffh56h57ch568h584h544h6bh654h57fh576h5d4h5d9h5bch56bh5f4h97h5bh5f9h598h66h98h676h54fh59bh594q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mddlf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6fdcc48fd5-lvls9_openstack(a5260d09-1e2d-4d28-8c75-a717898864e6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:15:13 crc kubenswrapper[4585]: E1201 14:15:13.397063 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6fdcc48fd5-lvls9" podUID="a5260d09-1e2d-4d28-8c75-a717898864e6" Dec 01 14:15:13 crc kubenswrapper[4585]: I1201 14:15:13.716903 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:15:13 crc kubenswrapper[4585]: I1201 14:15:13.717053 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:15:13 crc kubenswrapper[4585]: I1201 14:15:13.717153 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:15:13 crc kubenswrapper[4585]: I1201 14:15:13.717949 4585 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"041ce578b922e9949f0fa3c528cdc2179672e360c800f9ad0c54e96def5e8b8a"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:15:13 crc kubenswrapper[4585]: I1201 14:15:13.718041 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://041ce578b922e9949f0fa3c528cdc2179672e360c800f9ad0c54e96def5e8b8a" gracePeriod=600 Dec 01 14:15:13 crc kubenswrapper[4585]: I1201 14:15:13.763090 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" event={"ID":"dab17cce-6a4f-4ca4-9e77-f1451868e1d3","Type":"ContainerStarted","Data":"cf2ea527bb557c984fe759483efed00494bfc0789ef56cf8c1af2cbb8bafffaf"} Dec 01 14:15:14 crc kubenswrapper[4585]: E1201 14:15:14.051353 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 01 14:15:14 crc kubenswrapper[4585]: E1201 14:15:14.051911 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fv5nz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-sf4wz_openstack(7557d295-9d3a-4d0f-933a-77390e7e179e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:15:14 crc kubenswrapper[4585]: E1201 14:15:14.053126 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/barbican-db-sync-sf4wz" podUID="7557d295-9d3a-4d0f-933a-77390e7e179e" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.188352 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.330513 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-config\") pod \"36a1f1e1-f620-422c-b66d-db401c8be015\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.330585 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-dns-swift-storage-0\") pod \"36a1f1e1-f620-422c-b66d-db401c8be015\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.330640 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-ovsdbserver-sb\") pod \"36a1f1e1-f620-422c-b66d-db401c8be015\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.330782 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzg4f\" (UniqueName: \"kubernetes.io/projected/36a1f1e1-f620-422c-b66d-db401c8be015-kube-api-access-gzg4f\") pod \"36a1f1e1-f620-422c-b66d-db401c8be015\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.330830 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-ovsdbserver-nb\") pod \"36a1f1e1-f620-422c-b66d-db401c8be015\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.330880 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-dns-svc\") pod \"36a1f1e1-f620-422c-b66d-db401c8be015\" (UID: \"36a1f1e1-f620-422c-b66d-db401c8be015\") " Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.352073 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a1f1e1-f620-422c-b66d-db401c8be015-kube-api-access-gzg4f" (OuterVolumeSpecName: "kube-api-access-gzg4f") pod "36a1f1e1-f620-422c-b66d-db401c8be015" (UID: "36a1f1e1-f620-422c-b66d-db401c8be015"). InnerVolumeSpecName "kube-api-access-gzg4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.382526 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-config" (OuterVolumeSpecName: "config") pod "36a1f1e1-f620-422c-b66d-db401c8be015" (UID: "36a1f1e1-f620-422c-b66d-db401c8be015"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.382563 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "36a1f1e1-f620-422c-b66d-db401c8be015" (UID: "36a1f1e1-f620-422c-b66d-db401c8be015"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.383569 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "36a1f1e1-f620-422c-b66d-db401c8be015" (UID: "36a1f1e1-f620-422c-b66d-db401c8be015"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.385784 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "36a1f1e1-f620-422c-b66d-db401c8be015" (UID: "36a1f1e1-f620-422c-b66d-db401c8be015"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.387946 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "36a1f1e1-f620-422c-b66d-db401c8be015" (UID: "36a1f1e1-f620-422c-b66d-db401c8be015"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.433074 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.433100 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.433110 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.433121 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.433130 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzg4f\" (UniqueName: \"kubernetes.io/projected/36a1f1e1-f620-422c-b66d-db401c8be015-kube-api-access-gzg4f\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.433138 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36a1f1e1-f620-422c-b66d-db401c8be015-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.777439 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" event={"ID":"36a1f1e1-f620-422c-b66d-db401c8be015","Type":"ContainerDied","Data":"92ea8f40ebe9db56fa69f0a445b1cf0e5d0df8fa85aa52080388d91f808fe172"} Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.777821 4585 scope.go:117] "RemoveContainer" containerID="3dd39e7994be5549a91ddb1139b0379927b4d57c3046dd7791b50a8ed3f81d28" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.777482 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.782573 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="041ce578b922e9949f0fa3c528cdc2179672e360c800f9ad0c54e96def5e8b8a" exitCode=0 Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.782646 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"041ce578b922e9949f0fa3c528cdc2179672e360c800f9ad0c54e96def5e8b8a"} Dec 01 14:15:14 crc kubenswrapper[4585]: E1201 14:15:14.784799 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-sf4wz" podUID="7557d295-9d3a-4d0f-933a-77390e7e179e" Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.821484 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-5hcms"] Dec 01 14:15:14 crc kubenswrapper[4585]: I1201 14:15:14.828860 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-5hcms"] Dec 01 14:15:16 crc kubenswrapper[4585]: I1201 14:15:16.426664 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a1f1e1-f620-422c-b66d-db401c8be015" path="/var/lib/kubelet/pods/36a1f1e1-f620-422c-b66d-db401c8be015/volumes" Dec 01 14:15:16 crc kubenswrapper[4585]: I1201 14:15:16.682596 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-5hcms" podUID="36a1f1e1-f620-422c-b66d-db401c8be015" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: i/o timeout" Dec 01 14:15:16 crc kubenswrapper[4585]: I1201 14:15:16.800690 4585 generic.go:334] "Generic (PLEG): container finished" podID="cc9ff7ae-64b5-41db-94e0-5b30cb0a923c" containerID="9c70defbbb5c1c8842a26ab188f71e6a51ea09a520e94f285f9597e1d7859d01" exitCode=0 Dec 01 14:15:16 crc kubenswrapper[4585]: I1201 14:15:16.800987 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kg2tb" event={"ID":"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c","Type":"ContainerDied","Data":"9c70defbbb5c1c8842a26ab188f71e6a51ea09a520e94f285f9597e1d7859d01"} Dec 01 14:15:21 crc kubenswrapper[4585]: E1201 14:15:21.469789 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 01 14:15:21 crc kubenswrapper[4585]: E1201 14:15:21.470605 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b7h589h67ch58ch5c8h565h559hf5h55bh5f6h56h678h75h688h65fhd6h5ffhf8h5b6h599hdbhdfh5d5h5c4h8h5d6h5c4h5b6h648h644hfh59bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xqpbj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-54d4dd665-7g5hl_openstack(477277de-eaf5-4536-90d9-9091737cb66c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.583461 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.614840 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.689640 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-combined-ca-bundle\") pod \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\" (UID: \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\") " Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.689731 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mddlf\" (UniqueName: \"kubernetes.io/projected/a5260d09-1e2d-4d28-8c75-a717898864e6-kube-api-access-mddlf\") pod \"a5260d09-1e2d-4d28-8c75-a717898864e6\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.689763 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5260d09-1e2d-4d28-8c75-a717898864e6-logs\") pod \"a5260d09-1e2d-4d28-8c75-a717898864e6\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.689914 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5260d09-1e2d-4d28-8c75-a717898864e6-horizon-secret-key\") pod \"a5260d09-1e2d-4d28-8c75-a717898864e6\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.690293 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5260d09-1e2d-4d28-8c75-a717898864e6-logs" (OuterVolumeSpecName: "logs") pod "a5260d09-1e2d-4d28-8c75-a717898864e6" (UID: "a5260d09-1e2d-4d28-8c75-a717898864e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.690625 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5260d09-1e2d-4d28-8c75-a717898864e6-config-data" (OuterVolumeSpecName: "config-data") pod "a5260d09-1e2d-4d28-8c75-a717898864e6" (UID: "a5260d09-1e2d-4d28-8c75-a717898864e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.692567 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5260d09-1e2d-4d28-8c75-a717898864e6-config-data\") pod \"a5260d09-1e2d-4d28-8c75-a717898864e6\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.692622 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-config\") pod \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\" (UID: \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\") " Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.692646 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5260d09-1e2d-4d28-8c75-a717898864e6-scripts\") pod \"a5260d09-1e2d-4d28-8c75-a717898864e6\" (UID: \"a5260d09-1e2d-4d28-8c75-a717898864e6\") " Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.692664 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdqs6\" (UniqueName: \"kubernetes.io/projected/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-kube-api-access-bdqs6\") pod \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\" (UID: \"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c\") " Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.693812 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5260d09-1e2d-4d28-8c75-a717898864e6-scripts" (OuterVolumeSpecName: "scripts") pod "a5260d09-1e2d-4d28-8c75-a717898864e6" (UID: "a5260d09-1e2d-4d28-8c75-a717898864e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.696155 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5260d09-1e2d-4d28-8c75-a717898864e6-kube-api-access-mddlf" (OuterVolumeSpecName: "kube-api-access-mddlf") pod "a5260d09-1e2d-4d28-8c75-a717898864e6" (UID: "a5260d09-1e2d-4d28-8c75-a717898864e6"). InnerVolumeSpecName "kube-api-access-mddlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.696350 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5260d09-1e2d-4d28-8c75-a717898864e6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a5260d09-1e2d-4d28-8c75-a717898864e6" (UID: "a5260d09-1e2d-4d28-8c75-a717898864e6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.696549 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-kube-api-access-bdqs6" (OuterVolumeSpecName: "kube-api-access-bdqs6") pod "cc9ff7ae-64b5-41db-94e0-5b30cb0a923c" (UID: "cc9ff7ae-64b5-41db-94e0-5b30cb0a923c"). InnerVolumeSpecName "kube-api-access-bdqs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.719805 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-config" (OuterVolumeSpecName: "config") pod "cc9ff7ae-64b5-41db-94e0-5b30cb0a923c" (UID: "cc9ff7ae-64b5-41db-94e0-5b30cb0a923c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.720593 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc9ff7ae-64b5-41db-94e0-5b30cb0a923c" (UID: "cc9ff7ae-64b5-41db-94e0-5b30cb0a923c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.794462 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5260d09-1e2d-4d28-8c75-a717898864e6-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.794491 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.794500 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5260d09-1e2d-4d28-8c75-a717898864e6-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.794510 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdqs6\" (UniqueName: \"kubernetes.io/projected/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-kube-api-access-bdqs6\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.794519 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.794528 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mddlf\" (UniqueName: \"kubernetes.io/projected/a5260d09-1e2d-4d28-8c75-a717898864e6-kube-api-access-mddlf\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.794536 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5260d09-1e2d-4d28-8c75-a717898864e6-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.794545 4585 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5260d09-1e2d-4d28-8c75-a717898864e6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.842775 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kg2tb" event={"ID":"cc9ff7ae-64b5-41db-94e0-5b30cb0a923c","Type":"ContainerDied","Data":"8499f1a1f4781ad5db132b68bf5309f8e152571bc043a52b3630d7a6e863e9f7"} Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.842848 4585 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8499f1a1f4781ad5db132b68bf5309f8e152571bc043a52b3630d7a6e863e9f7" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.842906 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kg2tb" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.855496 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fdcc48fd5-lvls9" event={"ID":"a5260d09-1e2d-4d28-8c75-a717898864e6","Type":"ContainerDied","Data":"233d674c97c65e0a30d9aa063a3fca04303e435e86762bbc3afd83a4bad05bb0"} Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.855631 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fdcc48fd5-lvls9" Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.930408 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fdcc48fd5-lvls9"] Dec 01 14:15:21 crc kubenswrapper[4585]: I1201 14:15:21.940690 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6fdcc48fd5-lvls9"] Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.004089 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.424463 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5260d09-1e2d-4d28-8c75-a717898864e6" path="/var/lib/kubelet/pods/a5260d09-1e2d-4d28-8c75-a717898864e6/volumes" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.911786 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-szrfs"] Dec 01 14:15:22 crc kubenswrapper[4585]: E1201 14:15:22.912180 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a1f1e1-f620-422c-b66d-db401c8be015" containerName="dnsmasq-dns" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.912194 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a1f1e1-f620-422c-b66d-db401c8be015" containerName="dnsmasq-dns" Dec 01 14:15:22 crc kubenswrapper[4585]: E1201 14:15:22.912212 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a1f1e1-f620-422c-b66d-db401c8be015" containerName="init" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.912218 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a1f1e1-f620-422c-b66d-db401c8be015" containerName="init" Dec 01 14:15:22 crc kubenswrapper[4585]: E1201 14:15:22.917707 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9ff7ae-64b5-41db-94e0-5b30cb0a923c" containerName="neutron-db-sync" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.917728 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9ff7ae-64b5-41db-94e0-5b30cb0a923c" containerName="neutron-db-sync" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.917956 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc9ff7ae-64b5-41db-94e0-5b30cb0a923c" containerName="neutron-db-sync" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.917993 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a1f1e1-f620-422c-b66d-db401c8be015" containerName="dnsmasq-dns" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.918881 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.937260 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-szrfs"] Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.968350 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-84c9dcff9d-xn9xw"] Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.972185 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.977015 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.978067 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.978138 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.978285 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4p2xv" Dec 01 14:15:22 crc kubenswrapper[4585]: I1201 14:15:22.985845 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84c9dcff9d-xn9xw"] Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.017724 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-dns-svc\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.017813 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-ovndb-tls-certs\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.017832 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-config\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.017866 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-config\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.017885 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.017914 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-httpd-config\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.017946 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x84tt\" (UniqueName: \"kubernetes.io/projected/1e9edaf4-b918-4370-8184-79de4b087dfc-kube-api-access-x84tt\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.017988 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzgm2\" (UniqueName: \"kubernetes.io/projected/f1901264-234a-4675-9867-3fb1f2689592-kube-api-access-wzgm2\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.018089 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.018123 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-combined-ca-bundle\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.018154 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.120872 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.120924 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-combined-ca-bundle\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.120960 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.121010 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-dns-svc\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.121055 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-ovndb-tls-certs\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.121070 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-config\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.121094 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-config\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.121110 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.121134 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-httpd-config\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.121159 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x84tt\" (UniqueName: \"kubernetes.io/projected/1e9edaf4-b918-4370-8184-79de4b087dfc-kube-api-access-x84tt\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.121182 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzgm2\" (UniqueName: \"kubernetes.io/projected/f1901264-234a-4675-9867-3fb1f2689592-kube-api-access-wzgm2\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.122762 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-config\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.123593 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.125311 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-dns-svc\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.125860 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.130665 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.141394 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-combined-ca-bundle\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.153479 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-config\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.153508 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-httpd-config\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.157673 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x84tt\" (UniqueName: \"kubernetes.io/projected/1e9edaf4-b918-4370-8184-79de4b087dfc-kube-api-access-x84tt\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.164044 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzgm2\" (UniqueName: \"kubernetes.io/projected/f1901264-234a-4675-9867-3fb1f2689592-kube-api-access-wzgm2\") pod \"dnsmasq-dns-55f844cf75-szrfs\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.164059 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-ovndb-tls-certs\") pod \"neutron-84c9dcff9d-xn9xw\" (UID: 
\"1e9edaf4-b918-4370-8184-79de4b087dfc\") " pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.264496 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.315591 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.338394 4585 scope.go:117] "RemoveContainer" containerID="454e37522701b59d5c7376872f5534c7878a7027ca4dfb4e37d6f4044ec79619" Dec 01 14:15:23 crc kubenswrapper[4585]: W1201 14:15:23.357766 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a302cf5_b263_4654_b7cc_e7122f4b11cb.slice/crio-77c3b35fca68462f875efade018683047afcdbd79fbbaf2aa8a2840fd6ad625a WatchSource:0}: Error finding container 77c3b35fca68462f875efade018683047afcdbd79fbbaf2aa8a2840fd6ad625a: Status 404 returned error can't find the container with id 77c3b35fca68462f875efade018683047afcdbd79fbbaf2aa8a2840fd6ad625a Dec 01 14:15:23 crc kubenswrapper[4585]: E1201 14:15:23.367126 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 01 14:15:23 crc kubenswrapper[4585]: E1201 14:15:23.367319 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8c27,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOpti
ons:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-wmh4t_openstack(7dead942-d6c5-4a4a-aa5e-5c57b6da0c48): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:15:23 crc kubenswrapper[4585]: E1201 14:15:23.368480 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-wmh4t" podUID="7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.707887 4585 scope.go:117] "RemoveContainer" containerID="9c565360e1e1b852f24cf87ad3ed2b80ca20fd43a45c1f1f0ee3553f5b1d6b02" Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.909533 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a302cf5-b263-4654-b7cc-e7122f4b11cb","Type":"ContainerStarted","Data":"77c3b35fca68462f875efade018683047afcdbd79fbbaf2aa8a2840fd6ad625a"} Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.946010 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" event={"ID":"dab17cce-6a4f-4ca4-9e77-f1451868e1d3","Type":"ContainerStarted","Data":"5df85fd39531ca2cf08712fe18c70d2665ca45b8156a399546334e4f9deaee47"} Dec 01 14:15:23 crc kubenswrapper[4585]: I1201 14:15:23.982313 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" podStartSLOduration=23.982294396 podStartE2EDuration="23.982294396s" podCreationTimestamp="2025-12-01 14:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:23.978475924 +0000 UTC m=+1037.962689779" watchObservedRunningTime="2025-12-01 14:15:23.982294396 +0000 UTC m=+1037.966508251" Dec 01 14:15:24 crc kubenswrapper[4585]: E1201 14:15:24.017823 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-wmh4t" podUID="7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" Dec 01 14:15:24 crc kubenswrapper[4585]: I1201 14:15:24.047851 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xbp2z"] Dec 01 14:15:24 crc kubenswrapper[4585]: I1201 14:15:24.202715 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-szrfs"] Dec 01 14:15:24 crc kubenswrapper[4585]: I1201 14:15:24.344440 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 14:15:24 crc kubenswrapper[4585]: E1201 14:15:24.615047 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizon-54d4dd665-7g5hl" 
podUID="477277de-eaf5-4536-90d9-9091737cb66c" Dec 01 14:15:24 crc kubenswrapper[4585]: W1201 14:15:24.643138 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e9edaf4_b918_4370_8184_79de4b087dfc.slice/crio-59dc3cfa7006b8d561f40a55fed7dd5f9e8d0e9302623e9c33c850d19201eab9 WatchSource:0}: Error finding container 59dc3cfa7006b8d561f40a55fed7dd5f9e8d0e9302623e9c33c850d19201eab9: Status 404 returned error can't find the container with id 59dc3cfa7006b8d561f40a55fed7dd5f9e8d0e9302623e9c33c850d19201eab9 Dec 01 14:15:24 crc kubenswrapper[4585]: I1201 14:15:24.648123 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84c9dcff9d-xn9xw"] Dec 01 14:15:24 crc kubenswrapper[4585]: I1201 14:15:24.965167 4585 generic.go:334] "Generic (PLEG): container finished" podID="dab17cce-6a4f-4ca4-9e77-f1451868e1d3" containerID="5df85fd39531ca2cf08712fe18c70d2665ca45b8156a399546334e4f9deaee47" exitCode=0 Dec 01 14:15:24 crc kubenswrapper[4585]: I1201 14:15:24.966192 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" event={"ID":"dab17cce-6a4f-4ca4-9e77-f1451868e1d3","Type":"ContainerDied","Data":"5df85fd39531ca2cf08712fe18c70d2665ca45b8156a399546334e4f9deaee47"} Dec 01 14:15:24 crc kubenswrapper[4585]: I1201 14:15:24.983436 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"ad6574d507c0610da07eb42cf40383d7aa7800bda84bee35a347684dc954f810"} Dec 01 14:15:24 crc kubenswrapper[4585]: I1201 14:15:24.987145 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" event={"ID":"f1901264-234a-4675-9867-3fb1f2689592","Type":"ContainerStarted","Data":"b903e90908ddc4624c5f822114c3013766cc1d5db491b5554ccb41b61659ab5c"} Dec 01 14:15:24 crc kubenswrapper[4585]: I1201 14:15:24.995771 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbp2z" event={"ID":"1b34175d-932c-4b94-b6cf-b164891fc965","Type":"ContainerStarted","Data":"a465f46c7cb3d74a7cb7bf6557e4e930e276f13055c7a923f54e2bfeae106ea4"} Dec 01 14:15:25 crc kubenswrapper[4585]: I1201 14:15:25.001030 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8556bc9b75-ldp9m" event={"ID":"5a389fb6-e678-4552-8cf8-3aea857a545c","Type":"ContainerStarted","Data":"d838107b5f7119f26afc1894fb492640a7f5d895edcec345d4fda163059ae883"} Dec 01 14:15:25 crc kubenswrapper[4585]: I1201 14:15:25.010093 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84c9dcff9d-xn9xw" event={"ID":"1e9edaf4-b918-4370-8184-79de4b087dfc","Type":"ContainerStarted","Data":"59dc3cfa7006b8d561f40a55fed7dd5f9e8d0e9302623e9c33c850d19201eab9"} Dec 01 14:15:25 crc kubenswrapper[4585]: I1201 14:15:25.015299 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bbf659b46-55tth" event={"ID":"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1","Type":"ContainerStarted","Data":"25f0e1bcb3e80cc3c32608909f5c68214ea329a31fa77c713e018387b354e694"} Dec 01 14:15:25 crc kubenswrapper[4585]: I1201 14:15:25.018587 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d4dd665-7g5hl" event={"ID":"477277de-eaf5-4536-90d9-9091737cb66c","Type":"ContainerStarted","Data":"4dce655e753291e91d944ab90d6893b8a27eccfedd887ce33bdc93ab7812d057"} 
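The ErrImagePull and ImagePullBackOff entries above (openstack-cinder-api, openstack-horizon, openstack-barbican-api) can also be spotted from the API side rather than by grepping the kubelet journal. A minimal sketch with the official kubernetes Python client, assuming kubeconfig access to the same cluster and the "openstack" namespace seen in the log:

    from kubernetes import client, config

    config.load_kube_config()          # assumption: a reachable kubeconfig; use load_incluster_config() inside a pod
    v1 = client.CoreV1Api()

    # List containers whose waiting reason matches the pull errors logged by kubenswrapper.
    for pod in v1.list_namespaced_pod("openstack").items:
        for cs in (pod.status.container_statuses or []):
            waiting = cs.state.waiting if cs.state else None
            if waiting and waiting.reason in ("ImagePullBackOff", "ErrImagePull"):
                print(pod.metadata.name, cs.name, waiting.reason, waiting.message)

The waiting.reason field carries the same ImagePullBackOff / ErrImagePull strings that appear in the pod_workers.go "Error syncing pod, skipping" entries.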
Dec 01 14:15:25 crc kubenswrapper[4585]: I1201 14:15:25.018762 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54d4dd665-7g5hl" podUID="477277de-eaf5-4536-90d9-9091737cb66c" containerName="horizon" containerID="cri-o://4dce655e753291e91d944ab90d6893b8a27eccfedd887ce33bdc93ab7812d057" gracePeriod=30 Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.029673 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01f410ce-dfb4-4188-9a09-d3983f5ef047","Type":"ContainerStarted","Data":"1a1df266f646ed1f47b13f2f16437eedb70c940bc92b3bf42e2fa1b8ef3c0bff"} Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.032168 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8556bc9b75-ldp9m" event={"ID":"5a389fb6-e678-4552-8cf8-3aea857a545c","Type":"ContainerStarted","Data":"ccdf936df12a71d20f624a17f99cc73a7a22471ec69998e0bd3b3bb0444d8792"} Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.032312 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8556bc9b75-ldp9m" podUID="5a389fb6-e678-4552-8cf8-3aea857a545c" containerName="horizon-log" containerID="cri-o://d838107b5f7119f26afc1894fb492640a7f5d895edcec345d4fda163059ae883" gracePeriod=30 Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.032887 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8556bc9b75-ldp9m" podUID="5a389fb6-e678-4552-8cf8-3aea857a545c" containerName="horizon" containerID="cri-o://ccdf936df12a71d20f624a17f99cc73a7a22471ec69998e0bd3b3bb0444d8792" gracePeriod=30 Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.037806 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84c9dcff9d-xn9xw" event={"ID":"1e9edaf4-b918-4370-8184-79de4b087dfc","Type":"ContainerStarted","Data":"8456e5822a825f57ff78e6f6cdf5ebaf37da8d4e69fe80ac6039bc9517e07e80"} Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.040228 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bbf659b46-55tth" event={"ID":"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1","Type":"ContainerStarted","Data":"37ec8f025e77f78f9cd14818cc5d1fa36d23c0908d41f30996a803ffc9bcaa9d"} Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.043698 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a302cf5-b263-4654-b7cc-e7122f4b11cb","Type":"ContainerStarted","Data":"7889205e17509d9adb2f2146398b2a92c586e75952713cd7b30905412665187d"} Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.050806 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8556bc9b75-ldp9m" podStartSLOduration=5.005184887 podStartE2EDuration="35.050784863s" podCreationTimestamp="2025-12-01 14:14:51 +0000 UTC" firstStartedPulling="2025-12-01 14:14:53.339620284 +0000 UTC m=+1007.323834139" lastFinishedPulling="2025-12-01 14:15:23.38522026 +0000 UTC m=+1037.369434115" observedRunningTime="2025-12-01 14:15:26.048374329 +0000 UTC m=+1040.032588184" watchObservedRunningTime="2025-12-01 14:15:26.050784863 +0000 UTC m=+1040.034998718" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.060415 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5b64975d-2mfhq" event={"ID":"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288","Type":"ContainerStarted","Data":"b777eaee6c922789f9a336f3ca54a96b3c11e519ca4993ef5e19721ca1c35cf5"} 
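The SyncLoop (PLEG) ContainerStarted / ContainerDied entries and the gracePeriod=30 container kills above correspond to state transitions that are also visible as pod updates on the API server. A rough client-side equivalent using a watch (a sketch only, not the kubelet's internal PLEG; the namespace and timeout are assumptions):

    from kubernetes import client, config, watch

    config.load_kube_config()                      # assumption: a reachable kubeconfig
    v1 = client.CoreV1Api()

    w = watch.Watch()
    # Stream pod updates from the namespace seen in the log; stop after 60 seconds.
    for event in w.stream(v1.list_namespaced_pod, namespace="openstack", timeout_seconds=60):
        pod = event["object"]
        states = {
            cs.name: ("running" if cs.state.running else
                      "waiting" if cs.state.waiting else "terminated")
            for cs in (pod.status.container_statuses or [])
        }
        print(event["type"], pod.metadata.name, pod.status.phase, states)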
Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.063361 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-825nv" event={"ID":"30f08b08-85f9-4df9-97ce-f0f25238e889","Type":"ContainerStarted","Data":"084e10a06a307eb11f7434765a84865afe81b8eb1a3e6453eb7a9fdfd8da628e"} Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.065922 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbp2z" event={"ID":"1b34175d-932c-4b94-b6cf-b164891fc965","Type":"ContainerStarted","Data":"7b5fe4f21372621fcdc69c13b5289393dffb7bef4fab18f7188c18ce70125f59"} Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.068220 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192bd785-3d80-4b5e-8db2-55a0e3846802","Type":"ContainerStarted","Data":"a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6"} Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.072146 4585 generic.go:334] "Generic (PLEG): container finished" podID="f1901264-234a-4675-9867-3fb1f2689592" containerID="6558d1be25687287e0ed4143bbb3ddd6185caa9bb95debecc04ba71fdece357f" exitCode=0 Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.076184 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" event={"ID":"f1901264-234a-4675-9867-3fb1f2689592","Type":"ContainerDied","Data":"6558d1be25687287e0ed4143bbb3ddd6185caa9bb95debecc04ba71fdece357f"} Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.122833 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xbp2z" podStartSLOduration=20.122811933 podStartE2EDuration="20.122811933s" podCreationTimestamp="2025-12-01 14:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:26.118114358 +0000 UTC m=+1040.102328223" watchObservedRunningTime="2025-12-01 14:15:26.122811933 +0000 UTC m=+1040.107025788" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.141716 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6bbf659b46-55tth" podStartSLOduration=14.088959817 podStartE2EDuration="26.141695016s" podCreationTimestamp="2025-12-01 14:15:00 +0000 UTC" firstStartedPulling="2025-12-01 14:15:11.383317166 +0000 UTC m=+1025.367531021" lastFinishedPulling="2025-12-01 14:15:23.436052365 +0000 UTC m=+1037.420266220" observedRunningTime="2025-12-01 14:15:26.089661669 +0000 UTC m=+1040.073875544" watchObservedRunningTime="2025-12-01 14:15:26.141695016 +0000 UTC m=+1040.125908861" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.181113 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-825nv" podStartSLOduration=5.349122512 podStartE2EDuration="34.181097047s" podCreationTimestamp="2025-12-01 14:14:52 +0000 UTC" firstStartedPulling="2025-12-01 14:14:54.540126815 +0000 UTC m=+1008.524340670" lastFinishedPulling="2025-12-01 14:15:23.37210135 +0000 UTC m=+1037.356315205" observedRunningTime="2025-12-01 14:15:26.156269475 +0000 UTC m=+1040.140483350" watchObservedRunningTime="2025-12-01 14:15:26.181097047 +0000 UTC m=+1040.165310902" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.668809 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fc9dbdd9-h6k6z"] Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.670927 4585 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.677343 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.678670 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.693479 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fc9dbdd9-h6k6z"] Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.700998 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-public-tls-certs\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.701163 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4shx8\" (UniqueName: \"kubernetes.io/projected/197759e4-0035-4430-9bee-483578d6804e-kube-api-access-4shx8\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.701254 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-config\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.701284 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-ovndb-tls-certs\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.701348 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-internal-tls-certs\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.701461 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-combined-ca-bundle\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.701585 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-httpd-config\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.803632 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-combined-ca-bundle\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.804092 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-httpd-config\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.804227 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-public-tls-certs\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.804495 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4shx8\" (UniqueName: \"kubernetes.io/projected/197759e4-0035-4430-9bee-483578d6804e-kube-api-access-4shx8\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.805155 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-config\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.805192 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-ovndb-tls-certs\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.805257 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-internal-tls-certs\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.828357 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-internal-tls-certs\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.830920 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-combined-ca-bundle\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.845430 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-httpd-config\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " 
pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.847729 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4shx8\" (UniqueName: \"kubernetes.io/projected/197759e4-0035-4430-9bee-483578d6804e-kube-api-access-4shx8\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.851099 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-config\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.853754 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-public-tls-certs\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.876832 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/197759e4-0035-4430-9bee-483578d6804e-ovndb-tls-certs\") pod \"neutron-fc9dbdd9-h6k6z\" (UID: \"197759e4-0035-4430-9bee-483578d6804e\") " pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:26 crc kubenswrapper[4585]: I1201 14:15:26.989483 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.003353 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.011443 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-config-volume\") pod \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\" (UID: \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\") " Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.011548 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-secret-volume\") pod \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\" (UID: \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\") " Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.011669 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72s6d\" (UniqueName: \"kubernetes.io/projected/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-kube-api-access-72s6d\") pod \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\" (UID: \"dab17cce-6a4f-4ca4-9e77-f1451868e1d3\") " Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.014246 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-config-volume" (OuterVolumeSpecName: "config-volume") pod "dab17cce-6a4f-4ca4-9e77-f1451868e1d3" (UID: "dab17cce-6a4f-4ca4-9e77-f1451868e1d3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.028470 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-kube-api-access-72s6d" (OuterVolumeSpecName: "kube-api-access-72s6d") pod "dab17cce-6a4f-4ca4-9e77-f1451868e1d3" (UID: "dab17cce-6a4f-4ca4-9e77-f1451868e1d3"). InnerVolumeSpecName "kube-api-access-72s6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.044357 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dab17cce-6a4f-4ca4-9e77-f1451868e1d3" (UID: "dab17cce-6a4f-4ca4-9e77-f1451868e1d3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.098334 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a302cf5-b263-4654-b7cc-e7122f4b11cb","Type":"ContainerStarted","Data":"a0f5bda5a277365c87c89fc629c83a60030e06321be8230e8eafb8508497289b"} Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.107631 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5b64975d-2mfhq" event={"ID":"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288","Type":"ContainerStarted","Data":"93a074bee351885d31ad8a67b537a1943c95441154a2cae193cda36b50b6191f"} Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.114348 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72s6d\" (UniqueName: \"kubernetes.io/projected/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-kube-api-access-72s6d\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.114382 4585 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.114391 4585 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dab17cce-6a4f-4ca4-9e77-f1451868e1d3-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.127196 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" event={"ID":"dab17cce-6a4f-4ca4-9e77-f1451868e1d3","Type":"ContainerDied","Data":"cf2ea527bb557c984fe759483efed00494bfc0789ef56cf8c1af2cbb8bafffaf"} Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.127235 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf2ea527bb557c984fe759483efed00494bfc0789ef56cf8c1af2cbb8bafffaf" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.127294 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.132496 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="01f410ce-dfb4-4188-9a09-d3983f5ef047" containerName="glance-log" containerID="cri-o://1a1df266f646ed1f47b13f2f16437eedb70c940bc92b3bf42e2fa1b8ef3c0bff" gracePeriod=30 Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.132818 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01f410ce-dfb4-4188-9a09-d3983f5ef047","Type":"ContainerStarted","Data":"39b2a8a68acee63044ad0e0b1b528666c633d611abc7c1ece6720035d4762902"} Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.133349 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="01f410ce-dfb4-4188-9a09-d3983f5ef047" containerName="glance-httpd" containerID="cri-o://39b2a8a68acee63044ad0e0b1b528666c633d611abc7c1ece6720035d4762902" gracePeriod=30 Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.144639 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=26.144614421 podStartE2EDuration="26.144614421s" podCreationTimestamp="2025-12-01 14:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:27.13521915 +0000 UTC m=+1041.119433005" watchObservedRunningTime="2025-12-01 14:15:27.144614421 +0000 UTC m=+1041.128828276" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.187923 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=28.187901505 podStartE2EDuration="28.187901505s" podCreationTimestamp="2025-12-01 14:14:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:27.177073126 +0000 UTC m=+1041.161286981" watchObservedRunningTime="2025-12-01 14:15:27.187901505 +0000 UTC m=+1041.172115360" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.214827 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6f5b64975d-2mfhq" podStartSLOduration=13.432275585 podStartE2EDuration="27.214804562s" podCreationTimestamp="2025-12-01 14:15:00 +0000 UTC" firstStartedPulling="2025-12-01 14:15:09.833949277 +0000 UTC m=+1023.818163132" lastFinishedPulling="2025-12-01 14:15:23.616478254 +0000 UTC m=+1037.600692109" observedRunningTime="2025-12-01 14:15:27.204061785 +0000 UTC m=+1041.188275640" watchObservedRunningTime="2025-12-01 14:15:27.214804562 +0000 UTC m=+1041.199018427" Dec 01 14:15:27 crc kubenswrapper[4585]: I1201 14:15:27.905448 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fc9dbdd9-h6k6z"] Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.152256 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" event={"ID":"f1901264-234a-4675-9867-3fb1f2689592","Type":"ContainerStarted","Data":"13d5e365d0fef2c5c4fcf1847fb20ef4fcd1cbf13e70aea0e07d21661c48f846"} Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.153502 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" 
Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.157338 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc9dbdd9-h6k6z" event={"ID":"197759e4-0035-4430-9bee-483578d6804e","Type":"ContainerStarted","Data":"859f97fd341991f7d675ca0736b9286dc7e40b615f4d403e5b37fbba4b5dfd05"} Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.162372 4585 generic.go:334] "Generic (PLEG): container finished" podID="01f410ce-dfb4-4188-9a09-d3983f5ef047" containerID="39b2a8a68acee63044ad0e0b1b528666c633d611abc7c1ece6720035d4762902" exitCode=143 Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.162396 4585 generic.go:334] "Generic (PLEG): container finished" podID="01f410ce-dfb4-4188-9a09-d3983f5ef047" containerID="1a1df266f646ed1f47b13f2f16437eedb70c940bc92b3bf42e2fa1b8ef3c0bff" exitCode=143 Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.162540 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01f410ce-dfb4-4188-9a09-d3983f5ef047","Type":"ContainerDied","Data":"39b2a8a68acee63044ad0e0b1b528666c633d611abc7c1ece6720035d4762902"} Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.162574 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01f410ce-dfb4-4188-9a09-d3983f5ef047","Type":"ContainerDied","Data":"1a1df266f646ed1f47b13f2f16437eedb70c940bc92b3bf42e2fa1b8ef3c0bff"} Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.189160 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" podStartSLOduration=6.189143803 podStartE2EDuration="6.189143803s" podCreationTimestamp="2025-12-01 14:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:28.186137043 +0000 UTC m=+1042.170350898" watchObservedRunningTime="2025-12-01 14:15:28.189143803 +0000 UTC m=+1042.173357658" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.201211 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84c9dcff9d-xn9xw" event={"ID":"1e9edaf4-b918-4370-8184-79de4b087dfc","Type":"ContainerStarted","Data":"2d386bedb990d1f77ddf7db9f242792eaf5889a9e32ada4df5a3416b96df2c0d"} Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.202527 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.218375 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.219131 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sf4wz" event={"ID":"7557d295-9d3a-4d0f-933a-77390e7e179e","Type":"ContainerStarted","Data":"8159ec23eb18a08fdc106c8e732cbfa469362438df2218ee2363245f975bf8cb"} Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.228697 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-84c9dcff9d-xn9xw" podStartSLOduration=6.228658266 podStartE2EDuration="6.228658266s" podCreationTimestamp="2025-12-01 14:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:28.224322391 +0000 UTC m=+1042.208536246" watchObservedRunningTime="2025-12-01 14:15:28.228658266 +0000 UTC m=+1042.212872121" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.334942 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-sf4wz" podStartSLOduration=3.396299566 podStartE2EDuration="36.334922779s" podCreationTimestamp="2025-12-01 14:14:52 +0000 UTC" firstStartedPulling="2025-12-01 14:14:53.911543419 +0000 UTC m=+1007.895757274" lastFinishedPulling="2025-12-01 14:15:26.850166632 +0000 UTC m=+1040.834380487" observedRunningTime="2025-12-01 14:15:28.249896302 +0000 UTC m=+1042.234110157" watchObservedRunningTime="2025-12-01 14:15:28.334922779 +0000 UTC m=+1042.319136634" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.348997 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-public-tls-certs\") pod \"01f410ce-dfb4-4188-9a09-d3983f5ef047\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.349098 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01f410ce-dfb4-4188-9a09-d3983f5ef047-httpd-run\") pod \"01f410ce-dfb4-4188-9a09-d3983f5ef047\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.349157 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-scripts\") pod \"01f410ce-dfb4-4188-9a09-d3983f5ef047\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.349227 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"01f410ce-dfb4-4188-9a09-d3983f5ef047\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.349262 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-combined-ca-bundle\") pod \"01f410ce-dfb4-4188-9a09-d3983f5ef047\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.349318 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f410ce-dfb4-4188-9a09-d3983f5ef047-logs\") pod 
\"01f410ce-dfb4-4188-9a09-d3983f5ef047\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.349356 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-config-data\") pod \"01f410ce-dfb4-4188-9a09-d3983f5ef047\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.349401 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn4tm\" (UniqueName: \"kubernetes.io/projected/01f410ce-dfb4-4188-9a09-d3983f5ef047-kube-api-access-kn4tm\") pod \"01f410ce-dfb4-4188-9a09-d3983f5ef047\" (UID: \"01f410ce-dfb4-4188-9a09-d3983f5ef047\") " Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.355715 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01f410ce-dfb4-4188-9a09-d3983f5ef047-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "01f410ce-dfb4-4188-9a09-d3983f5ef047" (UID: "01f410ce-dfb4-4188-9a09-d3983f5ef047"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.381552 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01f410ce-dfb4-4188-9a09-d3983f5ef047-logs" (OuterVolumeSpecName: "logs") pod "01f410ce-dfb4-4188-9a09-d3983f5ef047" (UID: "01f410ce-dfb4-4188-9a09-d3983f5ef047"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.388320 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-scripts" (OuterVolumeSpecName: "scripts") pod "01f410ce-dfb4-4188-9a09-d3983f5ef047" (UID: "01f410ce-dfb4-4188-9a09-d3983f5ef047"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.397223 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "01f410ce-dfb4-4188-9a09-d3983f5ef047" (UID: "01f410ce-dfb4-4188-9a09-d3983f5ef047"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.398843 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f410ce-dfb4-4188-9a09-d3983f5ef047-kube-api-access-kn4tm" (OuterVolumeSpecName: "kube-api-access-kn4tm") pod "01f410ce-dfb4-4188-9a09-d3983f5ef047" (UID: "01f410ce-dfb4-4188-9a09-d3983f5ef047"). InnerVolumeSpecName "kube-api-access-kn4tm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.454021 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/01f410ce-dfb4-4188-9a09-d3983f5ef047-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.454125 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.454200 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.454215 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01f410ce-dfb4-4188-9a09-d3983f5ef047-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.454224 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn4tm\" (UniqueName: \"kubernetes.io/projected/01f410ce-dfb4-4188-9a09-d3983f5ef047-kube-api-access-kn4tm\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.474772 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01f410ce-dfb4-4188-9a09-d3983f5ef047" (UID: "01f410ce-dfb4-4188-9a09-d3983f5ef047"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.511664 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.546776 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-config-data" (OuterVolumeSpecName: "config-data") pod "01f410ce-dfb4-4188-9a09-d3983f5ef047" (UID: "01f410ce-dfb4-4188-9a09-d3983f5ef047"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.562668 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.562797 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.562874 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.604068 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "01f410ce-dfb4-4188-9a09-d3983f5ef047" (UID: "01f410ce-dfb4-4188-9a09-d3983f5ef047"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:28 crc kubenswrapper[4585]: I1201 14:15:28.668564 4585 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/01f410ce-dfb4-4188-9a09-d3983f5ef047-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.234393 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc9dbdd9-h6k6z" event={"ID":"197759e4-0035-4430-9bee-483578d6804e","Type":"ContainerStarted","Data":"fefd1a603cd2cb903f8ecc62fdc9b71f91f45798fa451f6aac2d7ee168b3fcbd"} Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.238350 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.238560 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"01f410ce-dfb4-4188-9a09-d3983f5ef047","Type":"ContainerDied","Data":"741bac4f5e07646f774dacc888fd3bc488b0bab3dcb4dfbe28925a1754d0db49"} Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.238600 4585 scope.go:117] "RemoveContainer" containerID="39b2a8a68acee63044ad0e0b1b528666c633d611abc7c1ece6720035d4762902" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.285485 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.292820 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.324375 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:15:29 crc kubenswrapper[4585]: E1201 14:15:29.325437 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab17cce-6a4f-4ca4-9e77-f1451868e1d3" containerName="collect-profiles" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.325481 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab17cce-6a4f-4ca4-9e77-f1451868e1d3" containerName="collect-profiles" Dec 01 14:15:29 crc kubenswrapper[4585]: E1201 14:15:29.325496 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f410ce-dfb4-4188-9a09-d3983f5ef047" containerName="glance-log" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.325504 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f410ce-dfb4-4188-9a09-d3983f5ef047" containerName="glance-log" Dec 01 14:15:29 crc kubenswrapper[4585]: E1201 14:15:29.325544 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f410ce-dfb4-4188-9a09-d3983f5ef047" containerName="glance-httpd" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.325552 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f410ce-dfb4-4188-9a09-d3983f5ef047" containerName="glance-httpd" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.325940 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f410ce-dfb4-4188-9a09-d3983f5ef047" containerName="glance-log" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.326009 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab17cce-6a4f-4ca4-9e77-f1451868e1d3" containerName="collect-profiles" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.326027 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f410ce-dfb4-4188-9a09-d3983f5ef047" containerName="glance-httpd" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.327366 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.332399 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.335554 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.367532 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.488959 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25feba7f-466b-4d39-9096-b5101c68502b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.489048 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-config-data\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.489068 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvvzt\" (UniqueName: \"kubernetes.io/projected/25feba7f-466b-4d39-9096-b5101c68502b-kube-api-access-mvvzt\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.489096 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.489492 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.489588 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-scripts\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.489654 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25feba7f-466b-4d39-9096-b5101c68502b-logs\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.489692 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.591238 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-scripts\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.591302 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25feba7f-466b-4d39-9096-b5101c68502b-logs\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.591332 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.591412 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25feba7f-466b-4d39-9096-b5101c68502b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.591444 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-config-data\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.591469 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvvzt\" (UniqueName: \"kubernetes.io/projected/25feba7f-466b-4d39-9096-b5101c68502b-kube-api-access-mvvzt\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.591500 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.591590 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.591849 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/25feba7f-466b-4d39-9096-b5101c68502b-logs\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.591893 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.592692 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25feba7f-466b-4d39-9096-b5101c68502b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.609731 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-scripts\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.610707 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvvzt\" (UniqueName: \"kubernetes.io/projected/25feba7f-466b-4d39-9096-b5101c68502b-kube-api-access-mvvzt\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.611438 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-config-data\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.612824 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.613678 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.629718 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " pod="openstack/glance-default-external-api-0" Dec 01 14:15:29 crc kubenswrapper[4585]: I1201 14:15:29.674747 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:15:30 crc kubenswrapper[4585]: I1201 14:15:30.332269 4585 scope.go:117] "RemoveContainer" containerID="1a1df266f646ed1f47b13f2f16437eedb70c940bc92b3bf42e2fa1b8ef3c0bff" Dec 01 14:15:30 crc kubenswrapper[4585]: I1201 14:15:30.436241 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f410ce-dfb4-4188-9a09-d3983f5ef047" path="/var/lib/kubelet/pods/01f410ce-dfb4-4188-9a09-d3983f5ef047/volumes" Dec 01 14:15:30 crc kubenswrapper[4585]: I1201 14:15:30.633128 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:30 crc kubenswrapper[4585]: I1201 14:15:30.635409 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:15:30 crc kubenswrapper[4585]: I1201 14:15:30.790201 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:30 crc kubenswrapper[4585]: I1201 14:15:30.790912 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:15:31 crc kubenswrapper[4585]: I1201 14:15:31.004601 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:15:31 crc kubenswrapper[4585]: I1201 14:15:31.275166 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc9dbdd9-h6k6z" event={"ID":"197759e4-0035-4430-9bee-483578d6804e","Type":"ContainerStarted","Data":"0d9aa028941de1bab495e663d987bf1c2e1998bb1c31c1c2c4b092ef512682b6"} Dec 01 14:15:31 crc kubenswrapper[4585]: I1201 14:15:31.276656 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:31 crc kubenswrapper[4585]: I1201 14:15:31.329453 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-fc9dbdd9-h6k6z" podStartSLOduration=5.329437092 podStartE2EDuration="5.329437092s" podCreationTimestamp="2025-12-01 14:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:31.303622634 +0000 UTC m=+1045.287836489" watchObservedRunningTime="2025-12-01 14:15:31.329437092 +0000 UTC m=+1045.313650947" Dec 01 14:15:31 crc kubenswrapper[4585]: I1201 14:15:31.342256 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192bd785-3d80-4b5e-8db2-55a0e3846802","Type":"ContainerStarted","Data":"192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909"} Dec 01 14:15:31 crc kubenswrapper[4585]: I1201 14:15:31.382674 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25feba7f-466b-4d39-9096-b5101c68502b","Type":"ContainerStarted","Data":"10b39410aa15cd2cd268c8c6a7d764a663a82ce26fef82aa409c44bb59811eaa"} Dec 01 14:15:31 crc kubenswrapper[4585]: I1201 14:15:31.973455 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:15:32 crc kubenswrapper[4585]: I1201 14:15:32.001868 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:15:32 crc kubenswrapper[4585]: I1201 14:15:32.103457 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Dec 01 14:15:32 crc kubenswrapper[4585]: I1201 14:15:32.103798 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:32 crc kubenswrapper[4585]: I1201 14:15:32.103815 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:32 crc kubenswrapper[4585]: I1201 14:15:32.103827 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:32 crc kubenswrapper[4585]: I1201 14:15:32.237879 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:32 crc kubenswrapper[4585]: I1201 14:15:32.241304 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:32 crc kubenswrapper[4585]: I1201 14:15:32.390771 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25feba7f-466b-4d39-9096-b5101c68502b","Type":"ContainerStarted","Data":"54858fd5198f9e7bae6e73d060e6cf805b39e71f39dcdcd87fbed8a918271822"} Dec 01 14:15:33 crc kubenswrapper[4585]: I1201 14:15:33.266116 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:15:33 crc kubenswrapper[4585]: I1201 14:15:33.325314 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dpwxn"] Dec 01 14:15:33 crc kubenswrapper[4585]: I1201 14:15:33.325680 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" podUID="c7490835-3211-4849-83cb-ec2e642df346" containerName="dnsmasq-dns" containerID="cri-o://d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96" gracePeriod=10 Dec 01 14:15:33 crc kubenswrapper[4585]: I1201 14:15:33.421170 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25feba7f-466b-4d39-9096-b5101c68502b","Type":"ContainerStarted","Data":"012370f09a16ff589a0d74c0ddb711e53d3edc2bbffa015966c5aaf1c9be3c39"} Dec 01 14:15:33 crc kubenswrapper[4585]: I1201 14:15:33.453622 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.453602683 podStartE2EDuration="4.453602683s" podCreationTimestamp="2025-12-01 14:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:33.449381121 +0000 UTC m=+1047.433594976" watchObservedRunningTime="2025-12-01 14:15:33.453602683 +0000 UTC m=+1047.437816538" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.080511 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.176960 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2vkv\" (UniqueName: \"kubernetes.io/projected/c7490835-3211-4849-83cb-ec2e642df346-kube-api-access-d2vkv\") pod \"c7490835-3211-4849-83cb-ec2e642df346\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.177474 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-ovsdbserver-sb\") pod \"c7490835-3211-4849-83cb-ec2e642df346\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.177572 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-dns-svc\") pod \"c7490835-3211-4849-83cb-ec2e642df346\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.177647 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-dns-swift-storage-0\") pod \"c7490835-3211-4849-83cb-ec2e642df346\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.177741 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-config\") pod \"c7490835-3211-4849-83cb-ec2e642df346\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.177832 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-ovsdbserver-nb\") pod \"c7490835-3211-4849-83cb-ec2e642df346\" (UID: \"c7490835-3211-4849-83cb-ec2e642df346\") " Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.212256 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7490835-3211-4849-83cb-ec2e642df346-kube-api-access-d2vkv" (OuterVolumeSpecName: "kube-api-access-d2vkv") pod "c7490835-3211-4849-83cb-ec2e642df346" (UID: "c7490835-3211-4849-83cb-ec2e642df346"). InnerVolumeSpecName "kube-api-access-d2vkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.241647 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c7490835-3211-4849-83cb-ec2e642df346" (UID: "c7490835-3211-4849-83cb-ec2e642df346"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.279916 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c7490835-3211-4849-83cb-ec2e642df346" (UID: "c7490835-3211-4849-83cb-ec2e642df346"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.280285 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2vkv\" (UniqueName: \"kubernetes.io/projected/c7490835-3211-4849-83cb-ec2e642df346-kube-api-access-d2vkv\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.280315 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.280324 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.298537 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7490835-3211-4849-83cb-ec2e642df346" (UID: "c7490835-3211-4849-83cb-ec2e642df346"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.334925 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-config" (OuterVolumeSpecName: "config") pod "c7490835-3211-4849-83cb-ec2e642df346" (UID: "c7490835-3211-4849-83cb-ec2e642df346"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.341355 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c7490835-3211-4849-83cb-ec2e642df346" (UID: "c7490835-3211-4849-83cb-ec2e642df346"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.381227 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.381260 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.381271 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7490835-3211-4849-83cb-ec2e642df346-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.455900 4585 generic.go:334] "Generic (PLEG): container finished" podID="30f08b08-85f9-4df9-97ce-f0f25238e889" containerID="084e10a06a307eb11f7434765a84865afe81b8eb1a3e6453eb7a9fdfd8da628e" exitCode=0 Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.456001 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-825nv" event={"ID":"30f08b08-85f9-4df9-97ce-f0f25238e889","Type":"ContainerDied","Data":"084e10a06a307eb11f7434765a84865afe81b8eb1a3e6453eb7a9fdfd8da628e"} Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.457753 4585 generic.go:334] "Generic (PLEG): container finished" podID="c7490835-3211-4849-83cb-ec2e642df346" containerID="d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96" exitCode=0 Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.460766 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.461040 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" event={"ID":"c7490835-3211-4849-83cb-ec2e642df346","Type":"ContainerDied","Data":"d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96"} Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.461107 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-dpwxn" event={"ID":"c7490835-3211-4849-83cb-ec2e642df346","Type":"ContainerDied","Data":"3dcef259f9299592021520568ccb57465291599ea7ad0048a4cff820c569c8be"} Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.461126 4585 scope.go:117] "RemoveContainer" containerID="d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.523865 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dpwxn"] Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.528390 4585 scope.go:117] "RemoveContainer" containerID="fb2fb01621a9a6293cd56586b1677b36fd943fabeab2837f5ab0c35aeb045d87" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.531617 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-dpwxn"] Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.551671 4585 scope.go:117] "RemoveContainer" containerID="d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96" Dec 01 14:15:34 crc kubenswrapper[4585]: E1201 14:15:34.552042 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96\": container with ID starting with d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96 not found: ID does not exist" containerID="d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.552084 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96"} err="failed to get container status \"d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96\": rpc error: code = NotFound desc = could not find container \"d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96\": container with ID starting with d96cb4613f3afa0936e764d46561752b707eb11424d55283a354ed8643649b96 not found: ID does not exist" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.552113 4585 scope.go:117] "RemoveContainer" containerID="fb2fb01621a9a6293cd56586b1677b36fd943fabeab2837f5ab0c35aeb045d87" Dec 01 14:15:34 crc kubenswrapper[4585]: E1201 14:15:34.552434 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb2fb01621a9a6293cd56586b1677b36fd943fabeab2837f5ab0c35aeb045d87\": container with ID starting with fb2fb01621a9a6293cd56586b1677b36fd943fabeab2837f5ab0c35aeb045d87 not found: ID does not exist" containerID="fb2fb01621a9a6293cd56586b1677b36fd943fabeab2837f5ab0c35aeb045d87" Dec 01 14:15:34 crc kubenswrapper[4585]: I1201 14:15:34.552465 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2fb01621a9a6293cd56586b1677b36fd943fabeab2837f5ab0c35aeb045d87"} err="failed to get container status 
\"fb2fb01621a9a6293cd56586b1677b36fd943fabeab2837f5ab0c35aeb045d87\": rpc error: code = NotFound desc = could not find container \"fb2fb01621a9a6293cd56586b1677b36fd943fabeab2837f5ab0c35aeb045d87\": container with ID starting with fb2fb01621a9a6293cd56586b1677b36fd943fabeab2837f5ab0c35aeb045d87 not found: ID does not exist" Dec 01 14:15:35 crc kubenswrapper[4585]: I1201 14:15:35.470722 4585 generic.go:334] "Generic (PLEG): container finished" podID="1b34175d-932c-4b94-b6cf-b164891fc965" containerID="7b5fe4f21372621fcdc69c13b5289393dffb7bef4fab18f7188c18ce70125f59" exitCode=0 Dec 01 14:15:35 crc kubenswrapper[4585]: I1201 14:15:35.470792 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbp2z" event={"ID":"1b34175d-932c-4b94-b6cf-b164891fc965","Type":"ContainerDied","Data":"7b5fe4f21372621fcdc69c13b5289393dffb7bef4fab18f7188c18ce70125f59"} Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.008925 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-825nv" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.111498 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqd5c\" (UniqueName: \"kubernetes.io/projected/30f08b08-85f9-4df9-97ce-f0f25238e889-kube-api-access-zqd5c\") pod \"30f08b08-85f9-4df9-97ce-f0f25238e889\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.111552 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-combined-ca-bundle\") pod \"30f08b08-85f9-4df9-97ce-f0f25238e889\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.111579 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-scripts\") pod \"30f08b08-85f9-4df9-97ce-f0f25238e889\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.111670 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f08b08-85f9-4df9-97ce-f0f25238e889-logs\") pod \"30f08b08-85f9-4df9-97ce-f0f25238e889\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.111708 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-config-data\") pod \"30f08b08-85f9-4df9-97ce-f0f25238e889\" (UID: \"30f08b08-85f9-4df9-97ce-f0f25238e889\") " Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.113791 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30f08b08-85f9-4df9-97ce-f0f25238e889-logs" (OuterVolumeSpecName: "logs") pod "30f08b08-85f9-4df9-97ce-f0f25238e889" (UID: "30f08b08-85f9-4df9-97ce-f0f25238e889"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.141356 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-scripts" (OuterVolumeSpecName: "scripts") pod "30f08b08-85f9-4df9-97ce-f0f25238e889" (UID: "30f08b08-85f9-4df9-97ce-f0f25238e889"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.146142 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f08b08-85f9-4df9-97ce-f0f25238e889-kube-api-access-zqd5c" (OuterVolumeSpecName: "kube-api-access-zqd5c") pod "30f08b08-85f9-4df9-97ce-f0f25238e889" (UID: "30f08b08-85f9-4df9-97ce-f0f25238e889"). InnerVolumeSpecName "kube-api-access-zqd5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.158127 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-config-data" (OuterVolumeSpecName: "config-data") pod "30f08b08-85f9-4df9-97ce-f0f25238e889" (UID: "30f08b08-85f9-4df9-97ce-f0f25238e889"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.162181 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30f08b08-85f9-4df9-97ce-f0f25238e889" (UID: "30f08b08-85f9-4df9-97ce-f0f25238e889"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.219185 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqd5c\" (UniqueName: \"kubernetes.io/projected/30f08b08-85f9-4df9-97ce-f0f25238e889-kube-api-access-zqd5c\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.219219 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.219230 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.219239 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30f08b08-85f9-4df9-97ce-f0f25238e889-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.219249 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f08b08-85f9-4df9-97ce-f0f25238e889-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.439040 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7490835-3211-4849-83cb-ec2e642df346" path="/var/lib/kubelet/pods/c7490835-3211-4849-83cb-ec2e642df346/volumes" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.500723 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-825nv" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.501452 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-825nv" event={"ID":"30f08b08-85f9-4df9-97ce-f0f25238e889","Type":"ContainerDied","Data":"a0153b7910cad9568eae2fbc14e3466eba38e67f6169a8711dec446857d34d48"} Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.501501 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0153b7910cad9568eae2fbc14e3466eba38e67f6169a8711dec446857d34d48" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.643053 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5594675dd-jdqsw"] Dec 01 14:15:36 crc kubenswrapper[4585]: E1201 14:15:36.643747 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f08b08-85f9-4df9-97ce-f0f25238e889" containerName="placement-db-sync" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.643768 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f08b08-85f9-4df9-97ce-f0f25238e889" containerName="placement-db-sync" Dec 01 14:15:36 crc kubenswrapper[4585]: E1201 14:15:36.643777 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7490835-3211-4849-83cb-ec2e642df346" containerName="init" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.643783 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7490835-3211-4849-83cb-ec2e642df346" containerName="init" Dec 01 14:15:36 crc kubenswrapper[4585]: E1201 14:15:36.643802 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7490835-3211-4849-83cb-ec2e642df346" containerName="dnsmasq-dns" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.643809 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7490835-3211-4849-83cb-ec2e642df346" containerName="dnsmasq-dns" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.644005 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7490835-3211-4849-83cb-ec2e642df346" containerName="dnsmasq-dns" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.644030 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f08b08-85f9-4df9-97ce-f0f25238e889" containerName="placement-db-sync" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.646228 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.654837 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.655022 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-wjcrj" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.658700 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.659055 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.679462 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.693521 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5594675dd-jdqsw"] Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.734340 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-combined-ca-bundle\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.734396 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-public-tls-certs\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.734446 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-internal-tls-certs\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.734488 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-config-data\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.734527 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gglj\" (UniqueName: \"kubernetes.io/projected/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-kube-api-access-8gglj\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.734551 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-logs\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.734586 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-scripts\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.844656 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-combined-ca-bundle\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.844718 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-public-tls-certs\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.844777 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-internal-tls-certs\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.844811 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-config-data\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.844858 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gglj\" (UniqueName: \"kubernetes.io/projected/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-kube-api-access-8gglj\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.844881 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-logs\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.844910 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-scripts\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.851437 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-logs\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.871575 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gglj\" (UniqueName: 
\"kubernetes.io/projected/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-kube-api-access-8gglj\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.908125 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-scripts\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.924547 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-internal-tls-certs\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.924903 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-public-tls-certs\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.946235 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-config-data\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:36 crc kubenswrapper[4585]: I1201 14:15:36.963688 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8ba0e29-a5ba-4540-a9b6-154a30ff9e99-combined-ca-bundle\") pod \"placement-5594675dd-jdqsw\" (UID: \"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99\") " pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.005414 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.270278 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.359165 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lbwv\" (UniqueName: \"kubernetes.io/projected/1b34175d-932c-4b94-b6cf-b164891fc965-kube-api-access-9lbwv\") pod \"1b34175d-932c-4b94-b6cf-b164891fc965\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.359248 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-scripts\") pod \"1b34175d-932c-4b94-b6cf-b164891fc965\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.359263 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-credential-keys\") pod \"1b34175d-932c-4b94-b6cf-b164891fc965\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.359324 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-combined-ca-bundle\") pod \"1b34175d-932c-4b94-b6cf-b164891fc965\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.359343 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-fernet-keys\") pod \"1b34175d-932c-4b94-b6cf-b164891fc965\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.359360 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-config-data\") pod \"1b34175d-932c-4b94-b6cf-b164891fc965\" (UID: \"1b34175d-932c-4b94-b6cf-b164891fc965\") " Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.379062 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1b34175d-932c-4b94-b6cf-b164891fc965" (UID: "1b34175d-932c-4b94-b6cf-b164891fc965"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.394383 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b34175d-932c-4b94-b6cf-b164891fc965-kube-api-access-9lbwv" (OuterVolumeSpecName: "kube-api-access-9lbwv") pod "1b34175d-932c-4b94-b6cf-b164891fc965" (UID: "1b34175d-932c-4b94-b6cf-b164891fc965"). InnerVolumeSpecName "kube-api-access-9lbwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.394982 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-config-data" (OuterVolumeSpecName: "config-data") pod "1b34175d-932c-4b94-b6cf-b164891fc965" (UID: "1b34175d-932c-4b94-b6cf-b164891fc965"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.396474 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-scripts" (OuterVolumeSpecName: "scripts") pod "1b34175d-932c-4b94-b6cf-b164891fc965" (UID: "1b34175d-932c-4b94-b6cf-b164891fc965"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.396701 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1b34175d-932c-4b94-b6cf-b164891fc965" (UID: "1b34175d-932c-4b94-b6cf-b164891fc965"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.451130 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b34175d-932c-4b94-b6cf-b164891fc965" (UID: "1b34175d-932c-4b94-b6cf-b164891fc965"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.466108 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.466138 4585 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.466151 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.466163 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lbwv\" (UniqueName: \"kubernetes.io/projected/1b34175d-932c-4b94-b6cf-b164891fc965-kube-api-access-9lbwv\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.466175 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.466188 4585 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1b34175d-932c-4b94-b6cf-b164891fc965-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.540522 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbp2z" event={"ID":"1b34175d-932c-4b94-b6cf-b164891fc965","Type":"ContainerDied","Data":"a465f46c7cb3d74a7cb7bf6557e4e930e276f13055c7a923f54e2bfeae106ea4"} Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.540560 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a465f46c7cb3d74a7cb7bf6557e4e930e276f13055c7a923f54e2bfeae106ea4" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.540621 4585 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xbp2z" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.563314 4585 generic.go:334] "Generic (PLEG): container finished" podID="7557d295-9d3a-4d0f-933a-77390e7e179e" containerID="8159ec23eb18a08fdc106c8e732cbfa469362438df2218ee2363245f975bf8cb" exitCode=0 Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.563359 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sf4wz" event={"ID":"7557d295-9d3a-4d0f-933a-77390e7e179e","Type":"ContainerDied","Data":"8159ec23eb18a08fdc106c8e732cbfa469362438df2218ee2363245f975bf8cb"} Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.570209 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.570593 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.716031 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-576d96b8bf-jl74m"] Dec 01 14:15:37 crc kubenswrapper[4585]: E1201 14:15:37.716474 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b34175d-932c-4b94-b6cf-b164891fc965" containerName="keystone-bootstrap" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.716486 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b34175d-932c-4b94-b6cf-b164891fc965" containerName="keystone-bootstrap" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.716660 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b34175d-932c-4b94-b6cf-b164891fc965" containerName="keystone-bootstrap" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.717274 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.724132 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.724418 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.724438 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.724535 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.724690 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mj6w8" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.724894 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.750958 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-576d96b8bf-jl74m"] Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.838340 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5594675dd-jdqsw"] Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.891604 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-combined-ca-bundle\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.891762 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-fernet-keys\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.891841 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-internal-tls-certs\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.891886 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-credential-keys\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.891910 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-scripts\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.891929 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-public-tls-certs\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.891994 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q64pz\" (UniqueName: \"kubernetes.io/projected/2ccd2ad6-369c-4511-9704-9f091dac6dd7-kube-api-access-q64pz\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.892041 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-config-data\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.993106 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-fernet-keys\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.993150 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-internal-tls-certs\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.993177 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-credential-keys\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.993202 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-scripts\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.993222 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-public-tls-certs\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.993262 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q64pz\" (UniqueName: \"kubernetes.io/projected/2ccd2ad6-369c-4511-9704-9f091dac6dd7-kube-api-access-q64pz\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.993298 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-config-data\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.993318 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-combined-ca-bundle\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:37 crc kubenswrapper[4585]: I1201 14:15:37.998000 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 14:15:38 crc kubenswrapper[4585]: I1201 14:15:38.003530 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-config-data\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:38 crc kubenswrapper[4585]: I1201 14:15:38.003735 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-scripts\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:38 crc kubenswrapper[4585]: I1201 14:15:38.006528 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-combined-ca-bundle\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:38 crc kubenswrapper[4585]: I1201 14:15:38.015248 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-fernet-keys\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:38 crc kubenswrapper[4585]: I1201 14:15:38.016412 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-public-tls-certs\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:38 crc kubenswrapper[4585]: I1201 14:15:38.017286 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-internal-tls-certs\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:38 crc kubenswrapper[4585]: I1201 14:15:38.049639 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ccd2ad6-369c-4511-9704-9f091dac6dd7-credential-keys\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:38 crc kubenswrapper[4585]: I1201 14:15:38.079340 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q64pz\" (UniqueName: 
\"kubernetes.io/projected/2ccd2ad6-369c-4511-9704-9f091dac6dd7-kube-api-access-q64pz\") pod \"keystone-576d96b8bf-jl74m\" (UID: \"2ccd2ad6-369c-4511-9704-9f091dac6dd7\") " pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:38 crc kubenswrapper[4585]: I1201 14:15:38.361987 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:39 crc kubenswrapper[4585]: I1201 14:15:39.674857 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 14:15:39 crc kubenswrapper[4585]: I1201 14:15:39.675222 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 14:15:39 crc kubenswrapper[4585]: I1201 14:15:39.792693 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 14:15:39 crc kubenswrapper[4585]: I1201 14:15:39.826839 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 14:15:40 crc kubenswrapper[4585]: I1201 14:15:40.589747 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 14:15:40 crc kubenswrapper[4585]: I1201 14:15:40.589777 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 14:15:40 crc kubenswrapper[4585]: I1201 14:15:40.634254 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 01 14:15:40 crc kubenswrapper[4585]: I1201 14:15:40.798305 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6bbf659b46-55tth" podUID="e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 01 14:15:42 crc kubenswrapper[4585]: I1201 14:15:42.609174 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 14:15:42 crc kubenswrapper[4585]: I1201 14:15:42.610077 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 14:15:44 crc kubenswrapper[4585]: I1201 14:15:44.960793 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 14:15:44 crc kubenswrapper[4585]: I1201 14:15:44.961481 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 14:15:45 crc kubenswrapper[4585]: I1201 14:15:45.166537 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 14:15:45 crc kubenswrapper[4585]: W1201 14:15:45.918736 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8ba0e29_a5ba_4540_a9b6_154a30ff9e99.slice/crio-929f16534129bcdee578ab82bfb9ba91bb7830f4963f66ba1b9024ff25df9430 WatchSource:0}: Error finding container 929f16534129bcdee578ab82bfb9ba91bb7830f4963f66ba1b9024ff25df9430: Status 404 returned error can't find the container with id 
929f16534129bcdee578ab82bfb9ba91bb7830f4963f66ba1b9024ff25df9430 Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.212338 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.411644 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7557d295-9d3a-4d0f-933a-77390e7e179e-db-sync-config-data\") pod \"7557d295-9d3a-4d0f-933a-77390e7e179e\" (UID: \"7557d295-9d3a-4d0f-933a-77390e7e179e\") " Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.411864 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7557d295-9d3a-4d0f-933a-77390e7e179e-combined-ca-bundle\") pod \"7557d295-9d3a-4d0f-933a-77390e7e179e\" (UID: \"7557d295-9d3a-4d0f-933a-77390e7e179e\") " Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.412112 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv5nz\" (UniqueName: \"kubernetes.io/projected/7557d295-9d3a-4d0f-933a-77390e7e179e-kube-api-access-fv5nz\") pod \"7557d295-9d3a-4d0f-933a-77390e7e179e\" (UID: \"7557d295-9d3a-4d0f-933a-77390e7e179e\") " Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.418136 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7557d295-9d3a-4d0f-933a-77390e7e179e-kube-api-access-fv5nz" (OuterVolumeSpecName: "kube-api-access-fv5nz") pod "7557d295-9d3a-4d0f-933a-77390e7e179e" (UID: "7557d295-9d3a-4d0f-933a-77390e7e179e"). InnerVolumeSpecName "kube-api-access-fv5nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.420501 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7557d295-9d3a-4d0f-933a-77390e7e179e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7557d295-9d3a-4d0f-933a-77390e7e179e" (UID: "7557d295-9d3a-4d0f-933a-77390e7e179e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.477634 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7557d295-9d3a-4d0f-933a-77390e7e179e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7557d295-9d3a-4d0f-933a-77390e7e179e" (UID: "7557d295-9d3a-4d0f-933a-77390e7e179e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.523478 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv5nz\" (UniqueName: \"kubernetes.io/projected/7557d295-9d3a-4d0f-933a-77390e7e179e-kube-api-access-fv5nz\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.523737 4585 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7557d295-9d3a-4d0f-933a-77390e7e179e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.523751 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7557d295-9d3a-4d0f-933a-77390e7e179e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.584534 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-576d96b8bf-jl74m"] Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.658707 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192bd785-3d80-4b5e-8db2-55a0e3846802","Type":"ContainerStarted","Data":"e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a"} Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.661265 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-576d96b8bf-jl74m" event={"ID":"2ccd2ad6-369c-4511-9704-9f091dac6dd7","Type":"ContainerStarted","Data":"6cc24730fbefa88b3e7b943c136aea982ca87f987a023691224afef19058a2b8"} Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.664140 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sf4wz" event={"ID":"7557d295-9d3a-4d0f-933a-77390e7e179e","Type":"ContainerDied","Data":"fb7b03029bd42f63483b4103dd22a27c4cb66a9899d0b086dd99c0310b175f90"} Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.664171 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb7b03029bd42f63483b4103dd22a27c4cb66a9899d0b086dd99c0310b175f90" Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.664242 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-sf4wz" Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.693764 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5594675dd-jdqsw" event={"ID":"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99","Type":"ContainerStarted","Data":"3e4e87e8d869eab4d09358cf4cb97b05d481a0aebff60d58d462f5a6648d09bc"} Dec 01 14:15:46 crc kubenswrapper[4585]: I1201 14:15:46.693816 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5594675dd-jdqsw" event={"ID":"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99","Type":"ContainerStarted","Data":"929f16534129bcdee578ab82bfb9ba91bb7830f4963f66ba1b9024ff25df9430"} Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.595607 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-868456d7cd-m64gp"] Dec 01 14:15:47 crc kubenswrapper[4585]: E1201 14:15:47.596249 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7557d295-9d3a-4d0f-933a-77390e7e179e" containerName="barbican-db-sync" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.596263 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="7557d295-9d3a-4d0f-933a-77390e7e179e" containerName="barbican-db-sync" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.596454 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="7557d295-9d3a-4d0f-933a-77390e7e179e" containerName="barbican-db-sync" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.598390 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.609750 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.610057 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.610209 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fkm5f" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.664030 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-868456d7cd-m64gp"] Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.681774 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-65f6dd57f9-22fc5"] Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.683446 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.687373 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.723963 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wmh4t" event={"ID":"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48","Type":"ContainerStarted","Data":"c3462b3634ded13b7f47d7b9fed51f7c11e55c11f88e9558889f60d36bb6fe91"} Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.737046 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5594675dd-jdqsw" event={"ID":"c8ba0e29-a5ba-4540-a9b6-154a30ff9e99","Type":"ContainerStarted","Data":"b3d51bfb046a7bb3ef8e0c99af7f1fbe9f46b3ca9ac238b63d10d8fcee80739a"} Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.737106 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.737137 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.737771 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65f6dd57f9-22fc5"] Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.750454 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-576d96b8bf-jl74m" event={"ID":"2ccd2ad6-369c-4511-9704-9f091dac6dd7","Type":"ContainerStarted","Data":"e650f21cf7f98e50ae81a1e99c7e3f1b468c27d6baa6dbc65291e2fea88d6b3d"} Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.751340 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.755187 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31b37f1e-9af0-4922-8e48-55f82175411c-config-data-custom\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.755529 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-combined-ca-bundle\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.755549 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t6nn\" (UniqueName: \"kubernetes.io/projected/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-kube-api-access-5t6nn\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.755609 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-config-data\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " 
pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.755652 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b37f1e-9af0-4922-8e48-55f82175411c-combined-ca-bundle\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.755718 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b37f1e-9af0-4922-8e48-55f82175411c-config-data\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.755795 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-logs\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.755823 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmmq9\" (UniqueName: \"kubernetes.io/projected/31b37f1e-9af0-4922-8e48-55f82175411c-kube-api-access-bmmq9\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.755887 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31b37f1e-9af0-4922-8e48-55f82175411c-logs\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.755918 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-config-data-custom\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.775238 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-wmh4t" podStartSLOduration=3.474099462 podStartE2EDuration="55.775221271s" podCreationTimestamp="2025-12-01 14:14:52 +0000 UTC" firstStartedPulling="2025-12-01 14:14:53.712871114 +0000 UTC m=+1007.697084969" lastFinishedPulling="2025-12-01 14:15:46.013992923 +0000 UTC m=+1059.998206778" observedRunningTime="2025-12-01 14:15:47.758250718 +0000 UTC m=+1061.742464573" watchObservedRunningTime="2025-12-01 14:15:47.775221271 +0000 UTC m=+1061.759435116" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.810688 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-5g4hh"] Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.816746 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.824533 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-77c848ff4-gtzqf"] Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.827295 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.847158 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.857330 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31b37f1e-9af0-4922-8e48-55f82175411c-logs\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.857372 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-config-data-custom\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.857408 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31b37f1e-9af0-4922-8e48-55f82175411c-config-data-custom\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.857431 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-combined-ca-bundle\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.857448 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t6nn\" (UniqueName: \"kubernetes.io/projected/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-kube-api-access-5t6nn\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.857486 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-config-data\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.857519 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b37f1e-9af0-4922-8e48-55f82175411c-combined-ca-bundle\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.857553 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b37f1e-9af0-4922-8e48-55f82175411c-config-data\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.857702 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-logs\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.857722 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmmq9\" (UniqueName: \"kubernetes.io/projected/31b37f1e-9af0-4922-8e48-55f82175411c-kube-api-access-bmmq9\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.874273 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31b37f1e-9af0-4922-8e48-55f82175411c-logs\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.878088 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-5g4hh"] Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.880915 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-logs\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.887645 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31b37f1e-9af0-4922-8e48-55f82175411c-config-data-custom\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.888579 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5594675dd-jdqsw" podStartSLOduration=11.888029698 podStartE2EDuration="11.888029698s" podCreationTimestamp="2025-12-01 14:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:47.817236581 +0000 UTC m=+1061.801450436" watchObservedRunningTime="2025-12-01 14:15:47.888029698 +0000 UTC m=+1061.872243553" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.916018 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-config-data-custom\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.916089 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-api-77c848ff4-gtzqf"] Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.917466 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t6nn\" (UniqueName: \"kubernetes.io/projected/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-kube-api-access-5t6nn\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.936183 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-576d96b8bf-jl74m" podStartSLOduration=10.936161341 podStartE2EDuration="10.936161341s" podCreationTimestamp="2025-12-01 14:15:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:47.856433415 +0000 UTC m=+1061.840647270" watchObservedRunningTime="2025-12-01 14:15:47.936161341 +0000 UTC m=+1061.920375206" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.937256 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-config-data\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.937734 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f0d5fb-5daa-4828-a1a8-92dc39a7c822-combined-ca-bundle\") pod \"barbican-worker-65f6dd57f9-22fc5\" (UID: \"61f0d5fb-5daa-4828-a1a8-92dc39a7c822\") " pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.938563 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31b37f1e-9af0-4922-8e48-55f82175411c-config-data\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.940870 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31b37f1e-9af0-4922-8e48-55f82175411c-combined-ca-bundle\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.946538 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmmq9\" (UniqueName: \"kubernetes.io/projected/31b37f1e-9af0-4922-8e48-55f82175411c-kube-api-access-bmmq9\") pod \"barbican-keystone-listener-868456d7cd-m64gp\" (UID: \"31b37f1e-9af0-4922-8e48-55f82175411c\") " pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.961313 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-config\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.961363 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwn6m\" (UniqueName: \"kubernetes.io/projected/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-kube-api-access-lwn6m\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.961393 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.961439 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-logs\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.961471 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.961491 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.961514 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-config-data\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.961534 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-combined-ca-bundle\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.961557 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs5n8\" (UniqueName: \"kubernetes.io/projected/a857dd33-a536-4375-80cc-318b8a8023b6-kube-api-access-zs5n8\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.961585 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-config-data-custom\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " 
pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:47 crc kubenswrapper[4585]: I1201 14:15:47.961608 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-dns-svc\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.016192 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65f6dd57f9-22fc5" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.062785 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-dns-svc\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.062865 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-config\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.062891 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwn6m\" (UniqueName: \"kubernetes.io/projected/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-kube-api-access-lwn6m\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.062916 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.062966 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-logs\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.063034 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.063059 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.063081 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-config-data\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.063104 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-combined-ca-bundle\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.063126 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs5n8\" (UniqueName: \"kubernetes.io/projected/a857dd33-a536-4375-80cc-318b8a8023b6-kube-api-access-zs5n8\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.063172 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-config-data-custom\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.064139 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-logs\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.068848 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-config-data\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.076072 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-config-data-custom\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.076813 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-dns-svc\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.080242 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-combined-ca-bundle\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.083465 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-config\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: 
\"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.083695 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.084532 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.086147 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.091517 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwn6m\" (UniqueName: \"kubernetes.io/projected/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-kube-api-access-lwn6m\") pod \"barbican-api-77c848ff4-gtzqf\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.093537 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs5n8\" (UniqueName: \"kubernetes.io/projected/a857dd33-a536-4375-80cc-318b8a8023b6-kube-api-access-zs5n8\") pod \"dnsmasq-dns-85ff748b95-5g4hh\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.156277 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.221532 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.320517 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.864274 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65f6dd57f9-22fc5"] Dec 01 14:15:48 crc kubenswrapper[4585]: I1201 14:15:48.980766 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-5g4hh"] Dec 01 14:15:49 crc kubenswrapper[4585]: W1201 14:15:49.021488 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda857dd33_a536_4375_80cc_318b8a8023b6.slice/crio-9e57c7464ae33b9da0dc4fc44f2a21ca7cea020cbff53d2a34423f406f594a47 WatchSource:0}: Error finding container 9e57c7464ae33b9da0dc4fc44f2a21ca7cea020cbff53d2a34423f406f594a47: Status 404 returned error can't find the container with id 9e57c7464ae33b9da0dc4fc44f2a21ca7cea020cbff53d2a34423f406f594a47 Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.117116 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77c848ff4-gtzqf"] Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.143029 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-868456d7cd-m64gp"] Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.799827 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65f6dd57f9-22fc5" event={"ID":"61f0d5fb-5daa-4828-a1a8-92dc39a7c822","Type":"ContainerStarted","Data":"43efafc47e9fdf5f0739102465f9a85c9775d8372c5c1c6a0aefc69580a2a325"} Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.802923 4585 generic.go:334] "Generic (PLEG): container finished" podID="a857dd33-a536-4375-80cc-318b8a8023b6" containerID="d2c1ab56ed39195ab4749b25a23c8d4a0a9b9c1f9162e96ad7c146782c89f5a4" exitCode=0 Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.803227 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" event={"ID":"a857dd33-a536-4375-80cc-318b8a8023b6","Type":"ContainerDied","Data":"d2c1ab56ed39195ab4749b25a23c8d4a0a9b9c1f9162e96ad7c146782c89f5a4"} Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.803277 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" event={"ID":"a857dd33-a536-4375-80cc-318b8a8023b6","Type":"ContainerStarted","Data":"9e57c7464ae33b9da0dc4fc44f2a21ca7cea020cbff53d2a34423f406f594a47"} Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.807743 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77c848ff4-gtzqf" event={"ID":"9dce0a63-d2c7-4004-b78a-5e7016fe10e9","Type":"ContainerStarted","Data":"b9bb77f521832e57692d2665547f08652dbf3fb329788eaea1b3c7ea2807b528"} Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.807789 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77c848ff4-gtzqf" event={"ID":"9dce0a63-d2c7-4004-b78a-5e7016fe10e9","Type":"ContainerStarted","Data":"6bc4be9fb2c8c795d2d716883a9fe496c594d9899aca60b4fb41a4607d1e190f"} Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.807803 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77c848ff4-gtzqf" event={"ID":"9dce0a63-d2c7-4004-b78a-5e7016fe10e9","Type":"ContainerStarted","Data":"9b56d7d3c4b389b564a1a9b8ef560950db9b065727e314987d81a739139362e4"} Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.808009 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.808059 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.810350 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" event={"ID":"31b37f1e-9af0-4922-8e48-55f82175411c","Type":"ContainerStarted","Data":"bb9a7646ab153d46c2e1bdeef5beade99dfac4331ac1732c593d64418f0ba463"} Dec 01 14:15:49 crc kubenswrapper[4585]: I1201 14:15:49.879789 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-77c848ff4-gtzqf" podStartSLOduration=2.879770559 podStartE2EDuration="2.879770559s" podCreationTimestamp="2025-12-01 14:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:49.842453774 +0000 UTC m=+1063.826667629" watchObservedRunningTime="2025-12-01 14:15:49.879770559 +0000 UTC m=+1063.863984414" Dec 01 14:15:50 crc kubenswrapper[4585]: I1201 14:15:50.633590 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 01 14:15:50 crc kubenswrapper[4585]: I1201 14:15:50.791307 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6bbf659b46-55tth" podUID="e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 01 14:15:50 crc kubenswrapper[4585]: I1201 14:15:50.826900 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" event={"ID":"a857dd33-a536-4375-80cc-318b8a8023b6","Type":"ContainerStarted","Data":"f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32"} Dec 01 14:15:50 crc kubenswrapper[4585]: I1201 14:15:50.826958 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.198766 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" podStartSLOduration=4.198746708 podStartE2EDuration="4.198746708s" podCreationTimestamp="2025-12-01 14:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:50.848154133 +0000 UTC m=+1064.832367988" watchObservedRunningTime="2025-12-01 14:15:51.198746708 +0000 UTC m=+1065.182960553" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.209367 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6564675f78-48rkf"] Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.211353 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.216864 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.217528 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.223557 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6564675f78-48rkf"] Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.334117 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-config-data\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.334158 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-combined-ca-bundle\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.334179 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-config-data-custom\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.334218 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-public-tls-certs\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.334252 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-internal-tls-certs\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.334271 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87101522-5785-472c-9563-d86146676171-logs\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.334308 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz7tb\" (UniqueName: \"kubernetes.io/projected/87101522-5785-472c-9563-d86146676171-kube-api-access-sz7tb\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.435736 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-public-tls-certs\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.435823 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-internal-tls-certs\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.435866 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87101522-5785-472c-9563-d86146676171-logs\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.435913 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz7tb\" (UniqueName: \"kubernetes.io/projected/87101522-5785-472c-9563-d86146676171-kube-api-access-sz7tb\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.436087 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-config-data\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.436126 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-combined-ca-bundle\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.436142 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-config-data-custom\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.437403 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87101522-5785-472c-9563-d86146676171-logs\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.445650 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-combined-ca-bundle\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.454985 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-public-tls-certs\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.457946 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-internal-tls-certs\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.469267 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz7tb\" (UniqueName: \"kubernetes.io/projected/87101522-5785-472c-9563-d86146676171-kube-api-access-sz7tb\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.472086 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-config-data\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.472662 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87101522-5785-472c-9563-d86146676171-config-data-custom\") pod \"barbican-api-6564675f78-48rkf\" (UID: \"87101522-5785-472c-9563-d86146676171\") " pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:51 crc kubenswrapper[4585]: I1201 14:15:51.551107 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:52 crc kubenswrapper[4585]: I1201 14:15:52.853909 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65f6dd57f9-22fc5" event={"ID":"61f0d5fb-5daa-4828-a1a8-92dc39a7c822","Type":"ContainerStarted","Data":"6e028b3d8ed23beeda62d27751e9bf270c8513ab13e1780941f6657624c97472"} Dec 01 14:15:52 crc kubenswrapper[4585]: I1201 14:15:52.856379 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" event={"ID":"31b37f1e-9af0-4922-8e48-55f82175411c","Type":"ContainerStarted","Data":"e0d9aec6b65264b36e9ab8dddba417b536061290e19da17151bfaa3685f2a293"} Dec 01 14:15:52 crc kubenswrapper[4585]: I1201 14:15:52.959073 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6564675f78-48rkf"] Dec 01 14:15:53 crc kubenswrapper[4585]: I1201 14:15:53.335016 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:15:53 crc kubenswrapper[4585]: I1201 14:15:53.871709 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" event={"ID":"31b37f1e-9af0-4922-8e48-55f82175411c","Type":"ContainerStarted","Data":"bc7696c23d9138f0c0066b331b3f58bcf474d0d41254e593b6e4d5f7c506cfa6"} Dec 01 14:15:53 crc kubenswrapper[4585]: I1201 14:15:53.881230 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65f6dd57f9-22fc5" event={"ID":"61f0d5fb-5daa-4828-a1a8-92dc39a7c822","Type":"ContainerStarted","Data":"22bb3ea3771db7663b1ad00249f467c840f50bb93fee1c3e78f2bd2572b515cc"} Dec 01 14:15:53 crc kubenswrapper[4585]: I1201 14:15:53.888109 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-868456d7cd-m64gp" podStartSLOduration=3.688951639 podStartE2EDuration="6.888098086s" podCreationTimestamp="2025-12-01 14:15:47 +0000 UTC" firstStartedPulling="2025-12-01 14:15:49.204253923 +0000 UTC m=+1063.188467778" lastFinishedPulling="2025-12-01 14:15:52.40340037 +0000 UTC m=+1066.387614225" observedRunningTime="2025-12-01 14:15:53.884050228 +0000 UTC m=+1067.868264083" watchObservedRunningTime="2025-12-01 14:15:53.888098086 +0000 UTC m=+1067.872311941" Dec 01 14:15:53 crc kubenswrapper[4585]: I1201 14:15:53.893717 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6564675f78-48rkf" event={"ID":"87101522-5785-472c-9563-d86146676171","Type":"ContainerStarted","Data":"0ae9a512fcc6d8f342f70ea3e0cfd83fae25da855f4590dd532e583f2feb6301"} Dec 01 14:15:53 crc kubenswrapper[4585]: I1201 14:15:53.893759 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6564675f78-48rkf" event={"ID":"87101522-5785-472c-9563-d86146676171","Type":"ContainerStarted","Data":"de0c68096c52d734d9a2c50d64bf3f993b51d7760a6fd78ed8bb074645c8ccaa"} Dec 01 14:15:53 crc kubenswrapper[4585]: I1201 14:15:53.893770 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6564675f78-48rkf" event={"ID":"87101522-5785-472c-9563-d86146676171","Type":"ContainerStarted","Data":"1e0d28f4cfc5b9d341cefa427e42f16d8414a3f60331802ec926016728832873"} Dec 01 14:15:53 crc kubenswrapper[4585]: I1201 14:15:53.893999 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:53 crc kubenswrapper[4585]: I1201 14:15:53.894078 4585 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:15:53 crc kubenswrapper[4585]: I1201 14:15:53.910612 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-65f6dd57f9-22fc5" podStartSLOduration=3.4076853910000002 podStartE2EDuration="6.910590835s" podCreationTimestamp="2025-12-01 14:15:47 +0000 UTC" firstStartedPulling="2025-12-01 14:15:48.867202999 +0000 UTC m=+1062.851416854" lastFinishedPulling="2025-12-01 14:15:52.370108443 +0000 UTC m=+1066.354322298" observedRunningTime="2025-12-01 14:15:53.907937934 +0000 UTC m=+1067.892151789" watchObservedRunningTime="2025-12-01 14:15:53.910590835 +0000 UTC m=+1067.894804690" Dec 01 14:15:53 crc kubenswrapper[4585]: I1201 14:15:53.949215 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6564675f78-48rkf" podStartSLOduration=2.949198474 podStartE2EDuration="2.949198474s" podCreationTimestamp="2025-12-01 14:15:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:15:53.944480328 +0000 UTC m=+1067.928694183" watchObservedRunningTime="2025-12-01 14:15:53.949198474 +0000 UTC m=+1067.933412329" Dec 01 14:15:55 crc kubenswrapper[4585]: I1201 14:15:55.914952 4585 generic.go:334] "Generic (PLEG): container finished" podID="7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" containerID="c3462b3634ded13b7f47d7b9fed51f7c11e55c11f88e9558889f60d36bb6fe91" exitCode=0 Dec 01 14:15:55 crc kubenswrapper[4585]: I1201 14:15:55.915020 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wmh4t" event={"ID":"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48","Type":"ContainerDied","Data":"c3462b3634ded13b7f47d7b9fed51f7c11e55c11f88e9558889f60d36bb6fe91"} Dec 01 14:15:55 crc kubenswrapper[4585]: I1201 14:15:55.920373 4585 generic.go:334] "Generic (PLEG): container finished" podID="477277de-eaf5-4536-90d9-9091737cb66c" containerID="4dce655e753291e91d944ab90d6893b8a27eccfedd887ce33bdc93ab7812d057" exitCode=137 Dec 01 14:15:55 crc kubenswrapper[4585]: I1201 14:15:55.920429 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d4dd665-7g5hl" event={"ID":"477277de-eaf5-4536-90d9-9091737cb66c","Type":"ContainerDied","Data":"4dce655e753291e91d944ab90d6893b8a27eccfedd887ce33bdc93ab7812d057"} Dec 01 14:15:56 crc kubenswrapper[4585]: W1201 14:15:56.083320 4585 container.go:586] Failed to update stats for container "/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a389fb6_e678_4552_8cf8_3aea857a545c.slice/crio-fa34beb20f110db1a486dd8ccaf3f004c4dcbbd684e658a0d5f61f4d952c9817": error while statting cgroup v2: [unable to parse /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a389fb6_e678_4552_8cf8_3aea857a545c.slice/crio-fa34beb20f110db1a486dd8ccaf3f004c4dcbbd684e658a0d5f61f4d952c9817/memory.stat: read /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a389fb6_e678_4552_8cf8_3aea857a545c.slice/crio-fa34beb20f110db1a486dd8ccaf3f004c4dcbbd684e658a0d5f61f4d952c9817/memory.stat: no such device], continuing to push stats Dec 01 14:15:56 crc kubenswrapper[4585]: I1201 14:15:56.636393 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-77c848ff4-gtzqf" podUID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Dec 01 14:15:56 crc kubenswrapper[4585]: I1201 14:15:56.930839 4585 generic.go:334] "Generic (PLEG): container finished" podID="5a389fb6-e678-4552-8cf8-3aea857a545c" containerID="ccdf936df12a71d20f624a17f99cc73a7a22471ec69998e0bd3b3bb0444d8792" exitCode=137 Dec 01 14:15:56 crc kubenswrapper[4585]: I1201 14:15:56.930869 4585 generic.go:334] "Generic (PLEG): container finished" podID="5a389fb6-e678-4552-8cf8-3aea857a545c" containerID="d838107b5f7119f26afc1894fb492640a7f5d895edcec345d4fda163059ae883" exitCode=137 Dec 01 14:15:56 crc kubenswrapper[4585]: I1201 14:15:56.930880 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8556bc9b75-ldp9m" event={"ID":"5a389fb6-e678-4552-8cf8-3aea857a545c","Type":"ContainerDied","Data":"ccdf936df12a71d20f624a17f99cc73a7a22471ec69998e0bd3b3bb0444d8792"} Dec 01 14:15:56 crc kubenswrapper[4585]: I1201 14:15:56.930923 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8556bc9b75-ldp9m" event={"ID":"5a389fb6-e678-4552-8cf8-3aea857a545c","Type":"ContainerDied","Data":"d838107b5f7119f26afc1894fb492640a7f5d895edcec345d4fda163059ae883"} Dec 01 14:15:57 crc kubenswrapper[4585]: I1201 14:15:57.027879 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-fc9dbdd9-h6k6z" Dec 01 14:15:57 crc kubenswrapper[4585]: I1201 14:15:57.090157 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84c9dcff9d-xn9xw"] Dec 01 14:15:57 crc kubenswrapper[4585]: I1201 14:15:57.090404 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84c9dcff9d-xn9xw" podUID="1e9edaf4-b918-4370-8184-79de4b087dfc" containerName="neutron-api" containerID="cri-o://8456e5822a825f57ff78e6f6cdf5ebaf37da8d4e69fe80ac6039bc9517e07e80" gracePeriod=30 Dec 01 14:15:57 crc kubenswrapper[4585]: I1201 14:15:57.092118 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84c9dcff9d-xn9xw" podUID="1e9edaf4-b918-4370-8184-79de4b087dfc" containerName="neutron-httpd" containerID="cri-o://2d386bedb990d1f77ddf7db9f242792eaf5889a9e32ada4df5a3416b96df2c0d" gracePeriod=30 Dec 01 14:15:57 crc kubenswrapper[4585]: I1201 14:15:57.946246 4585 generic.go:334] "Generic (PLEG): container finished" podID="1e9edaf4-b918-4370-8184-79de4b087dfc" containerID="2d386bedb990d1f77ddf7db9f242792eaf5889a9e32ada4df5a3416b96df2c0d" exitCode=0 Dec 01 14:15:57 crc kubenswrapper[4585]: I1201 14:15:57.946460 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84c9dcff9d-xn9xw" event={"ID":"1e9edaf4-b918-4370-8184-79de4b087dfc","Type":"ContainerDied","Data":"2d386bedb990d1f77ddf7db9f242792eaf5889a9e32ada4df5a3416b96df2c0d"} Dec 01 14:15:58 crc kubenswrapper[4585]: I1201 14:15:58.158871 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:15:58 crc kubenswrapper[4585]: I1201 14:15:58.277692 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-szrfs"] Dec 01 14:15:58 crc kubenswrapper[4585]: I1201 14:15:58.277894 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" podUID="f1901264-234a-4675-9867-3fb1f2689592" containerName="dnsmasq-dns" containerID="cri-o://13d5e365d0fef2c5c4fcf1847fb20ef4fcd1cbf13e70aea0e07d21661c48f846" gracePeriod=10 Dec 01 14:15:58 crc kubenswrapper[4585]: I1201 14:15:58.994045 4585 generic.go:334] 
"Generic (PLEG): container finished" podID="f1901264-234a-4675-9867-3fb1f2689592" containerID="13d5e365d0fef2c5c4fcf1847fb20ef4fcd1cbf13e70aea0e07d21661c48f846" exitCode=0 Dec 01 14:15:58 crc kubenswrapper[4585]: I1201 14:15:58.994124 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" event={"ID":"f1901264-234a-4675-9867-3fb1f2689592","Type":"ContainerDied","Data":"13d5e365d0fef2c5c4fcf1847fb20ef4fcd1cbf13e70aea0e07d21661c48f846"} Dec 01 14:16:00 crc kubenswrapper[4585]: I1201 14:16:00.110024 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:16:00 crc kubenswrapper[4585]: I1201 14:16:00.633235 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 01 14:16:00 crc kubenswrapper[4585]: I1201 14:16:00.633308 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:16:00 crc kubenswrapper[4585]: I1201 14:16:00.634050 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"93a074bee351885d31ad8a67b537a1943c95441154a2cae193cda36b50b6191f"} pod="openstack/horizon-6f5b64975d-2mfhq" containerMessage="Container horizon failed startup probe, will be restarted" Dec 01 14:16:00 crc kubenswrapper[4585]: I1201 14:16:00.634087 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" containerID="cri-o://93a074bee351885d31ad8a67b537a1943c95441154a2cae193cda36b50b6191f" gracePeriod=30 Dec 01 14:16:00 crc kubenswrapper[4585]: I1201 14:16:00.791292 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6bbf659b46-55tth" podUID="e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 01 14:16:00 crc kubenswrapper[4585]: I1201 14:16:00.791372 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:16:00 crc kubenswrapper[4585]: I1201 14:16:00.792119 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"37ec8f025e77f78f9cd14818cc5d1fa36d23c0908d41f30996a803ffc9bcaa9d"} pod="openstack/horizon-6bbf659b46-55tth" containerMessage="Container horizon failed startup probe, will be restarted" Dec 01 14:16:00 crc kubenswrapper[4585]: I1201 14:16:00.792167 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6bbf659b46-55tth" podUID="e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1" containerName="horizon" containerID="cri-o://37ec8f025e77f78f9cd14818cc5d1fa36d23c0908d41f30996a803ffc9bcaa9d" gracePeriod=30 Dec 01 14:16:00 crc kubenswrapper[4585]: I1201 14:16:00.861682 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:16:01 crc kubenswrapper[4585]: I1201 14:16:01.965461 4585 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:16:01 crc kubenswrapper[4585]: I1201 14:16:01.985286 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.048677 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54d4dd665-7g5hl" event={"ID":"477277de-eaf5-4536-90d9-9091737cb66c","Type":"ContainerDied","Data":"4daf9f1042b58fdd080dfd024c4ab2faa26471369bf810a35309b7c12aa1db0c"} Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.048731 4585 scope.go:117] "RemoveContainer" containerID="4dce655e753291e91d944ab90d6893b8a27eccfedd887ce33bdc93ab7812d057" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.048863 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54d4dd665-7g5hl" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.067171 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wmh4t" event={"ID":"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48","Type":"ContainerDied","Data":"fba5b48e7d41d5e73af29130c40c82c177a2f45ee55d349fef091f6fc7de4d96"} Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.067205 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fba5b48e7d41d5e73af29130c40c82c177a2f45ee55d349fef091f6fc7de4d96" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.067267 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wmh4t" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.094984 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/477277de-eaf5-4536-90d9-9091737cb66c-config-data\") pod \"477277de-eaf5-4536-90d9-9091737cb66c\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.095020 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-combined-ca-bundle\") pod \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.095078 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/477277de-eaf5-4536-90d9-9091737cb66c-logs\") pod \"477277de-eaf5-4536-90d9-9091737cb66c\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.095140 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-scripts\") pod \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.095173 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8c27\" (UniqueName: \"kubernetes.io/projected/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-kube-api-access-m8c27\") pod \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.095204 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-config-data\") pod \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.095253 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqpbj\" (UniqueName: \"kubernetes.io/projected/477277de-eaf5-4536-90d9-9091737cb66c-kube-api-access-xqpbj\") pod \"477277de-eaf5-4536-90d9-9091737cb66c\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.095305 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/477277de-eaf5-4536-90d9-9091737cb66c-scripts\") pod \"477277de-eaf5-4536-90d9-9091737cb66c\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.095338 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-db-sync-config-data\") pod \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.095366 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/477277de-eaf5-4536-90d9-9091737cb66c-horizon-secret-key\") pod \"477277de-eaf5-4536-90d9-9091737cb66c\" (UID: \"477277de-eaf5-4536-90d9-9091737cb66c\") " Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.095400 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-etc-machine-id\") pod \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\" (UID: \"7dead942-d6c5-4a4a-aa5e-5c57b6da0c48\") " Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.095794 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" (UID: "7dead942-d6c5-4a4a-aa5e-5c57b6da0c48"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.104679 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/477277de-eaf5-4536-90d9-9091737cb66c-logs" (OuterVolumeSpecName: "logs") pod "477277de-eaf5-4536-90d9-9091737cb66c" (UID: "477277de-eaf5-4536-90d9-9091737cb66c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.127101 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" (UID: "7dead942-d6c5-4a4a-aa5e-5c57b6da0c48"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.127468 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-kube-api-access-m8c27" (OuterVolumeSpecName: "kube-api-access-m8c27") pod "7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" (UID: "7dead942-d6c5-4a4a-aa5e-5c57b6da0c48"). InnerVolumeSpecName "kube-api-access-m8c27". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.146486 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-scripts" (OuterVolumeSpecName: "scripts") pod "7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" (UID: "7dead942-d6c5-4a4a-aa5e-5c57b6da0c48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.203509 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.203540 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8c27\" (UniqueName: \"kubernetes.io/projected/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-kube-api-access-m8c27\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.203552 4585 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.203560 4585 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.203569 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/477277de-eaf5-4536-90d9-9091737cb66c-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.245894 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/477277de-eaf5-4536-90d9-9091737cb66c-scripts" (OuterVolumeSpecName: "scripts") pod "477277de-eaf5-4536-90d9-9091737cb66c" (UID: "477277de-eaf5-4536-90d9-9091737cb66c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.252275 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/477277de-eaf5-4536-90d9-9091737cb66c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "477277de-eaf5-4536-90d9-9091737cb66c" (UID: "477277de-eaf5-4536-90d9-9091737cb66c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.260807 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/477277de-eaf5-4536-90d9-9091737cb66c-config-data" (OuterVolumeSpecName: "config-data") pod "477277de-eaf5-4536-90d9-9091737cb66c" (UID: "477277de-eaf5-4536-90d9-9091737cb66c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.264770 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" (UID: "7dead942-d6c5-4a4a-aa5e-5c57b6da0c48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.305442 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/477277de-eaf5-4536-90d9-9091737cb66c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.305703 4585 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/477277de-eaf5-4536-90d9-9091737cb66c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.305764 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.305933 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/477277de-eaf5-4536-90d9-9091737cb66c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.307135 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/477277de-eaf5-4536-90d9-9091737cb66c-kube-api-access-xqpbj" (OuterVolumeSpecName: "kube-api-access-xqpbj") pod "477277de-eaf5-4536-90d9-9091737cb66c" (UID: "477277de-eaf5-4536-90d9-9091737cb66c"). InnerVolumeSpecName "kube-api-access-xqpbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.368885 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-config-data" (OuterVolumeSpecName: "config-data") pod "7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" (UID: "7dead942-d6c5-4a4a-aa5e-5c57b6da0c48"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.421398 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.421429 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqpbj\" (UniqueName: \"kubernetes.io/projected/477277de-eaf5-4536-90d9-9091737cb66c-kube-api-access-xqpbj\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.450322 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54d4dd665-7g5hl"] Dec 01 14:16:02 crc kubenswrapper[4585]: I1201 14:16:02.450913 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54d4dd665-7g5hl"] Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.316190 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 14:16:03 crc kubenswrapper[4585]: E1201 14:16:03.316858 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477277de-eaf5-4536-90d9-9091737cb66c" containerName="horizon" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.316871 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="477277de-eaf5-4536-90d9-9091737cb66c" containerName="horizon" Dec 01 14:16:03 crc kubenswrapper[4585]: E1201 14:16:03.316887 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" containerName="cinder-db-sync" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.316893 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" containerName="cinder-db-sync" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.317163 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" containerName="cinder-db-sync" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.317180 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="477277de-eaf5-4536-90d9-9091737cb66c" containerName="horizon" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.318121 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.340787 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.341227 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.341276 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.341496 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9gnvq" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.358223 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.424316 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.431893 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-tztg6"] Dec 01 14:16:03 crc kubenswrapper[4585]: E1201 14:16:03.432905 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a389fb6-e678-4552-8cf8-3aea857a545c" containerName="horizon-log" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.432939 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a389fb6-e678-4552-8cf8-3aea857a545c" containerName="horizon-log" Dec 01 14:16:03 crc kubenswrapper[4585]: E1201 14:16:03.433008 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a389fb6-e678-4552-8cf8-3aea857a545c" containerName="horizon" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.433019 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a389fb6-e678-4552-8cf8-3aea857a545c" containerName="horizon" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.434492 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a389fb6-e678-4552-8cf8-3aea857a545c" containerName="horizon-log" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.434524 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a389fb6-e678-4552-8cf8-3aea857a545c" containerName="horizon" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.439867 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.462650 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.462773 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b331842-f530-45ae-92e4-59fd379833b0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.462826 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.463025 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-config-data\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.463043 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-scripts\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 
14:16:03.463071 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qfdg\" (UniqueName: \"kubernetes.io/projected/1b331842-f530-45ae-92e4-59fd379833b0-kube-api-access-9qfdg\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.464783 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.496206 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-tztg6"] Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.564292 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a389fb6-e678-4552-8cf8-3aea857a545c-horizon-secret-key\") pod \"5a389fb6-e678-4552-8cf8-3aea857a545c\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.564350 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a389fb6-e678-4552-8cf8-3aea857a545c-config-data\") pod \"5a389fb6-e678-4552-8cf8-3aea857a545c\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.564376 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a389fb6-e678-4552-8cf8-3aea857a545c-logs\") pod \"5a389fb6-e678-4552-8cf8-3aea857a545c\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.564395 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a389fb6-e678-4552-8cf8-3aea857a545c-scripts\") pod \"5a389fb6-e678-4552-8cf8-3aea857a545c\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.564425 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-ovsdbserver-sb\") pod \"f1901264-234a-4675-9867-3fb1f2689592\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.564453 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-dns-swift-storage-0\") pod \"f1901264-234a-4675-9867-3fb1f2689592\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.564500 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzgm2\" (UniqueName: \"kubernetes.io/projected/f1901264-234a-4675-9867-3fb1f2689592-kube-api-access-wzgm2\") pod \"f1901264-234a-4675-9867-3fb1f2689592\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.564580 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82jp9\" (UniqueName: \"kubernetes.io/projected/5a389fb6-e678-4552-8cf8-3aea857a545c-kube-api-access-82jp9\") pod \"5a389fb6-e678-4552-8cf8-3aea857a545c\" (UID: \"5a389fb6-e678-4552-8cf8-3aea857a545c\") " Dec 01 
14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.564600 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-ovsdbserver-nb\") pod \"f1901264-234a-4675-9867-3fb1f2689592\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.564666 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-config\") pod \"f1901264-234a-4675-9867-3fb1f2689592\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.564708 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-dns-svc\") pod \"f1901264-234a-4675-9867-3fb1f2689592\" (UID: \"f1901264-234a-4675-9867-3fb1f2689592\") " Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.564920 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.565034 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.565067 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.565097 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjltc\" (UniqueName: \"kubernetes.io/projected/ee30ed1d-c158-48eb-b68f-1e613718edeb-kube-api-access-qjltc\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.565122 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-config\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.565153 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-config-data\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.565170 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-scripts\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.565194 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qfdg\" (UniqueName: \"kubernetes.io/projected/1b331842-f530-45ae-92e4-59fd379833b0-kube-api-access-9qfdg\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.565258 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.565276 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.565302 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b331842-f530-45ae-92e4-59fd379833b0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.565318 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.566865 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a389fb6-e678-4552-8cf8-3aea857a545c-logs" (OuterVolumeSpecName: "logs") pod "5a389fb6-e678-4552-8cf8-3aea857a545c" (UID: "5a389fb6-e678-4552-8cf8-3aea857a545c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.594514 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1901264-234a-4675-9867-3fb1f2689592-kube-api-access-wzgm2" (OuterVolumeSpecName: "kube-api-access-wzgm2") pod "f1901264-234a-4675-9867-3fb1f2689592" (UID: "f1901264-234a-4675-9867-3fb1f2689592"). InnerVolumeSpecName "kube-api-access-wzgm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.614217 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.617791 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b331842-f530-45ae-92e4-59fd379833b0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.640265 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a389fb6-e678-4552-8cf8-3aea857a545c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5a389fb6-e678-4552-8cf8-3aea857a545c" (UID: "5a389fb6-e678-4552-8cf8-3aea857a545c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.670755 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.677649 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.677724 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.677812 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.677838 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.677890 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjltc\" (UniqueName: \"kubernetes.io/projected/ee30ed1d-c158-48eb-b68f-1e613718edeb-kube-api-access-qjltc\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" 
Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.677911 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-config\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.678050 4585 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a389fb6-e678-4552-8cf8-3aea857a545c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.678062 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a389fb6-e678-4552-8cf8-3aea857a545c-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.678070 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzgm2\" (UniqueName: \"kubernetes.io/projected/f1901264-234a-4675-9867-3fb1f2689592-kube-api-access-wzgm2\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.679017 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-config\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.679325 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a389fb6-e678-4552-8cf8-3aea857a545c-kube-api-access-82jp9" (OuterVolumeSpecName: "kube-api-access-82jp9") pod "5a389fb6-e678-4552-8cf8-3aea857a545c" (UID: "5a389fb6-e678-4552-8cf8-3aea857a545c"). InnerVolumeSpecName "kube-api-access-82jp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.679676 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.679890 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.680668 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.686148 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.700550 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qfdg\" (UniqueName: \"kubernetes.io/projected/1b331842-f530-45ae-92e4-59fd379833b0-kube-api-access-9qfdg\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.706077 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-scripts\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.722557 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a389fb6-e678-4552-8cf8-3aea857a545c-scripts" (OuterVolumeSpecName: "scripts") pod "5a389fb6-e678-4552-8cf8-3aea857a545c" (UID: "5a389fb6-e678-4552-8cf8-3aea857a545c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.724193 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-config-data\") pod \"cinder-scheduler-0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.725965 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjltc\" (UniqueName: \"kubernetes.io/projected/ee30ed1d-c158-48eb-b68f-1e613718edeb-kube-api-access-qjltc\") pod \"dnsmasq-dns-5c9776ccc5-tztg6\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.768643 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a389fb6-e678-4552-8cf8-3aea857a545c-config-data" (OuterVolumeSpecName: "config-data") pod "5a389fb6-e678-4552-8cf8-3aea857a545c" (UID: "5a389fb6-e678-4552-8cf8-3aea857a545c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.779882 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a389fb6-e678-4552-8cf8-3aea857a545c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.779905 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a389fb6-e678-4552-8cf8-3aea857a545c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.779915 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82jp9\" (UniqueName: \"kubernetes.io/projected/5a389fb6-e678-4552-8cf8-3aea857a545c-kube-api-access-82jp9\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.780083 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.824660 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.927217 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 14:16:03 crc kubenswrapper[4585]: E1201 14:16:03.927637 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1901264-234a-4675-9867-3fb1f2689592" containerName="dnsmasq-dns" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.927648 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1901264-234a-4675-9867-3fb1f2689592" containerName="dnsmasq-dns" Dec 01 14:16:03 crc kubenswrapper[4585]: E1201 14:16:03.927674 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1901264-234a-4675-9867-3fb1f2689592" containerName="init" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.927680 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1901264-234a-4675-9867-3fb1f2689592" containerName="init" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.927871 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1901264-234a-4675-9867-3fb1f2689592" containerName="dnsmasq-dns" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.951265 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 14:16:03 crc kubenswrapper[4585]: I1201 14:16:03.965947 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.004837 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.029037 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f1901264-234a-4675-9867-3fb1f2689592" (UID: "f1901264-234a-4675-9867-3fb1f2689592"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.030449 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.037643 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0f792a5-42c4-4a55-af66-55abc1be9684-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.037708 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-scripts\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.037797 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-config-data-custom\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.037848 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f792a5-42c4-4a55-af66-55abc1be9684-logs\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.037952 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-config-data\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.038002 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zljdp\" (UniqueName: \"kubernetes.io/projected/a0f792a5-42c4-4a55-af66-55abc1be9684-kube-api-access-zljdp\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.038137 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.057207 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1901264-234a-4675-9867-3fb1f2689592" (UID: "f1901264-234a-4675-9867-3fb1f2689592"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.095569 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1901264-234a-4675-9867-3fb1f2689592" (UID: "f1901264-234a-4675-9867-3fb1f2689592"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.120584 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-config" (OuterVolumeSpecName: "config") pod "f1901264-234a-4675-9867-3fb1f2689592" (UID: "f1901264-234a-4675-9867-3fb1f2689592"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.122156 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1901264-234a-4675-9867-3fb1f2689592" (UID: "f1901264-234a-4675-9867-3fb1f2689592"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.139880 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0f792a5-42c4-4a55-af66-55abc1be9684-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.139922 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-scripts\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.139954 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-config-data-custom\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.139990 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f792a5-42c4-4a55-af66-55abc1be9684-logs\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.140025 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-config-data\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.140041 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zljdp\" (UniqueName: \"kubernetes.io/projected/a0f792a5-42c4-4a55-af66-55abc1be9684-kube-api-access-zljdp\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.140047 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0f792a5-42c4-4a55-af66-55abc1be9684-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.140086 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.140135 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.140146 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.140156 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.140167 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1901264-234a-4675-9867-3fb1f2689592-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.140718 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f792a5-42c4-4a55-af66-55abc1be9684-logs\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.156277 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-config-data\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.157812 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" event={"ID":"f1901264-234a-4675-9867-3fb1f2689592","Type":"ContainerDied","Data":"b903e90908ddc4624c5f822114c3013766cc1d5db491b5554ccb41b61659ab5c"} Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.157858 4585 scope.go:117] "RemoveContainer" containerID="13d5e365d0fef2c5c4fcf1847fb20ef4fcd1cbf13e70aea0e07d21661c48f846" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.157996 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.168636 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-config-data-custom\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.180327 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zljdp\" (UniqueName: \"kubernetes.io/projected/a0f792a5-42c4-4a55-af66-55abc1be9684-kube-api-access-zljdp\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.181964 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.189583 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-scripts\") pod \"cinder-api-0\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.193248 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8556bc9b75-ldp9m" event={"ID":"5a389fb6-e678-4552-8cf8-3aea857a545c","Type":"ContainerDied","Data":"fa34beb20f110db1a486dd8ccaf3f004c4dcbbd684e658a0d5f61f4d952c9817"} Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.193294 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8556bc9b75-ldp9m" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.198651 4585 generic.go:334] "Generic (PLEG): container finished" podID="1e9edaf4-b918-4370-8184-79de4b087dfc" containerID="8456e5822a825f57ff78e6f6cdf5ebaf37da8d4e69fe80ac6039bc9517e07e80" exitCode=0 Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.201667 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84c9dcff9d-xn9xw" event={"ID":"1e9edaf4-b918-4370-8184-79de4b087dfc","Type":"ContainerDied","Data":"8456e5822a825f57ff78e6f6cdf5ebaf37da8d4e69fe80ac6039bc9517e07e80"} Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.216186 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-szrfs"] Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.223560 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-szrfs"] Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.248699 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.256387 4585 scope.go:117] "RemoveContainer" containerID="6558d1be25687287e0ed4143bbb3ddd6185caa9bb95debecc04ba71fdece357f" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.268729 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8556bc9b75-ldp9m"] Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.317233 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8556bc9b75-ldp9m"] Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.333549 4585 scope.go:117] "RemoveContainer" containerID="ccdf936df12a71d20f624a17f99cc73a7a22471ec69998e0bd3b3bb0444d8792" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.390237 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.434243 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="477277de-eaf5-4536-90d9-9091737cb66c" path="/var/lib/kubelet/pods/477277de-eaf5-4536-90d9-9091737cb66c/volumes" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.434865 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a389fb6-e678-4552-8cf8-3aea857a545c" path="/var/lib/kubelet/pods/5a389fb6-e678-4552-8cf8-3aea857a545c/volumes" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.435443 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1901264-234a-4675-9867-3fb1f2689592" path="/var/lib/kubelet/pods/f1901264-234a-4675-9867-3fb1f2689592/volumes" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.490630 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x84tt\" (UniqueName: \"kubernetes.io/projected/1e9edaf4-b918-4370-8184-79de4b087dfc-kube-api-access-x84tt\") pod \"1e9edaf4-b918-4370-8184-79de4b087dfc\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.490786 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-httpd-config\") pod \"1e9edaf4-b918-4370-8184-79de4b087dfc\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.490889 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-combined-ca-bundle\") pod \"1e9edaf4-b918-4370-8184-79de4b087dfc\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.490957 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-ovndb-tls-certs\") pod \"1e9edaf4-b918-4370-8184-79de4b087dfc\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.491030 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-config\") pod \"1e9edaf4-b918-4370-8184-79de4b087dfc\" (UID: \"1e9edaf4-b918-4370-8184-79de4b087dfc\") " Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.505955 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/1e9edaf4-b918-4370-8184-79de4b087dfc-kube-api-access-x84tt" (OuterVolumeSpecName: "kube-api-access-x84tt") pod "1e9edaf4-b918-4370-8184-79de4b087dfc" (UID: "1e9edaf4-b918-4370-8184-79de4b087dfc"). InnerVolumeSpecName "kube-api-access-x84tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.517236 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1e9edaf4-b918-4370-8184-79de4b087dfc" (UID: "1e9edaf4-b918-4370-8184-79de4b087dfc"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.570203 4585 scope.go:117] "RemoveContainer" containerID="d838107b5f7119f26afc1894fb492640a7f5d895edcec345d4fda163059ae883" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.570340 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.576759 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-tztg6"] Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.592705 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.592725 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x84tt\" (UniqueName: \"kubernetes.io/projected/1e9edaf4-b918-4370-8184-79de4b087dfc-kube-api-access-x84tt\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.657841 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-config" (OuterVolumeSpecName: "config") pod "1e9edaf4-b918-4370-8184-79de4b087dfc" (UID: "1e9edaf4-b918-4370-8184-79de4b087dfc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.663590 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1e9edaf4-b918-4370-8184-79de4b087dfc" (UID: "1e9edaf4-b918-4370-8184-79de4b087dfc"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.688094 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e9edaf4-b918-4370-8184-79de4b087dfc" (UID: "1e9edaf4-b918-4370-8184-79de4b087dfc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.694926 4585 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.694961 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:04 crc kubenswrapper[4585]: I1201 14:16:04.694991 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e9edaf4-b918-4370-8184-79de4b087dfc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.211513 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.238698 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.257174 4585 generic.go:334] "Generic (PLEG): container finished" podID="ee30ed1d-c158-48eb-b68f-1e613718edeb" containerID="53be4dc2e21501649ee25680a49eabe364a6a4c3326076876a18fdee764d3604" exitCode=0 Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.257237 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" event={"ID":"ee30ed1d-c158-48eb-b68f-1e613718edeb","Type":"ContainerDied","Data":"53be4dc2e21501649ee25680a49eabe364a6a4c3326076876a18fdee764d3604"} Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.257263 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" event={"ID":"ee30ed1d-c158-48eb-b68f-1e613718edeb","Type":"ContainerStarted","Data":"1d301e9a19e493643cf7c617958a39eac70913defa3d4b43b0ee5055feae4ab1"} Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.281933 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1b331842-f530-45ae-92e4-59fd379833b0","Type":"ContainerStarted","Data":"fdde948a4a318c9a93a5e724b97fa3471a798c0b5ef9238d83356ee264b89904"} Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.291436 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192bd785-3d80-4b5e-8db2-55a0e3846802","Type":"ContainerStarted","Data":"ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881"} Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.291607 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="ceilometer-central-agent" containerID="cri-o://a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6" gracePeriod=30 Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.291678 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.291993 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="proxy-httpd" containerID="cri-o://ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881" gracePeriod=30 Dec 01 14:16:05 crc 
kubenswrapper[4585]: I1201 14:16:05.292042 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="sg-core" containerID="cri-o://e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a" gracePeriod=30 Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.292073 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="ceilometer-notification-agent" containerID="cri-o://192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909" gracePeriod=30 Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.314106 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84c9dcff9d-xn9xw" Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.315008 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84c9dcff9d-xn9xw" event={"ID":"1e9edaf4-b918-4370-8184-79de4b087dfc","Type":"ContainerDied","Data":"59dc3cfa7006b8d561f40a55fed7dd5f9e8d0e9302623e9c33c850d19201eab9"} Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.315058 4585 scope.go:117] "RemoveContainer" containerID="2d386bedb990d1f77ddf7db9f242792eaf5889a9e32ada4df5a3416b96df2c0d" Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.332767 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.128020932 podStartE2EDuration="1m13.332750765s" podCreationTimestamp="2025-12-01 14:14:52 +0000 UTC" firstStartedPulling="2025-12-01 14:14:54.571352677 +0000 UTC m=+1008.555566533" lastFinishedPulling="2025-12-01 14:16:03.776082511 +0000 UTC m=+1077.760296366" observedRunningTime="2025-12-01 14:16:05.324069053 +0000 UTC m=+1079.308282908" watchObservedRunningTime="2025-12-01 14:16:05.332750765 +0000 UTC m=+1079.316964620" Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.349305 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0f792a5-42c4-4a55-af66-55abc1be9684","Type":"ContainerStarted","Data":"48e53bd1661e973c1addbe6cc6cd5d3056142db4a00908fe269811d06edec52a"} Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.511091 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84c9dcff9d-xn9xw"] Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.522154 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-84c9dcff9d-xn9xw"] Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.540082 4585 scope.go:117] "RemoveContainer" containerID="8456e5822a825f57ff78e6f6cdf5ebaf37da8d4e69fe80ac6039bc9517e07e80" Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.902170 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6564675f78-48rkf" Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.977823 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-77c848ff4-gtzqf"] Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.978089 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-77c848ff4-gtzqf" podUID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerName="barbican-api-log" containerID="cri-o://6bc4be9fb2c8c795d2d716883a9fe496c594d9899aca60b4fb41a4607d1e190f" gracePeriod=30 Dec 01 14:16:05 crc kubenswrapper[4585]: I1201 14:16:05.978594 4585 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-77c848ff4-gtzqf" podUID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerName="barbican-api" containerID="cri-o://b9bb77f521832e57692d2665547f08652dbf3fb329788eaea1b3c7ea2807b528" gracePeriod=30 Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.373668 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0f792a5-42c4-4a55-af66-55abc1be9684","Type":"ContainerStarted","Data":"4d4ac1f3d1190d82d6047f199043485ff97e61fab0a004e4c8f649462887feb6"} Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.377163 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" event={"ID":"ee30ed1d-c158-48eb-b68f-1e613718edeb","Type":"ContainerStarted","Data":"745b88980361c0a8994adb67d0c89d686e0571033b3b0a60d64f9b7fb2aead6f"} Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.377305 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.386412 4585 generic.go:334] "Generic (PLEG): container finished" podID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerID="6bc4be9fb2c8c795d2d716883a9fe496c594d9899aca60b4fb41a4607d1e190f" exitCode=143 Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.386466 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77c848ff4-gtzqf" event={"ID":"9dce0a63-d2c7-4004-b78a-5e7016fe10e9","Type":"ContainerDied","Data":"6bc4be9fb2c8c795d2d716883a9fe496c594d9899aca60b4fb41a4607d1e190f"} Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.408618 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" podStartSLOduration=3.408600223 podStartE2EDuration="3.408600223s" podCreationTimestamp="2025-12-01 14:16:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:16:06.406211389 +0000 UTC m=+1080.390425254" watchObservedRunningTime="2025-12-01 14:16:06.408600223 +0000 UTC m=+1080.392814078" Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.431880 4585 generic.go:334] "Generic (PLEG): container finished" podID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerID="ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881" exitCode=0 Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.431918 4585 generic.go:334] "Generic (PLEG): container finished" podID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerID="e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a" exitCode=2 Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.431926 4585 generic.go:334] "Generic (PLEG): container finished" podID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerID="a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6" exitCode=0 Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.431885 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e9edaf4-b918-4370-8184-79de4b087dfc" path="/var/lib/kubelet/pods/1e9edaf4-b918-4370-8184-79de4b087dfc/volumes" Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.432593 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192bd785-3d80-4b5e-8db2-55a0e3846802","Type":"ContainerDied","Data":"ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881"} Dec 01 14:16:06 
crc kubenswrapper[4585]: I1201 14:16:06.432620 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192bd785-3d80-4b5e-8db2-55a0e3846802","Type":"ContainerDied","Data":"e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a"} Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.432629 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192bd785-3d80-4b5e-8db2-55a0e3846802","Type":"ContainerDied","Data":"a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6"} Dec 01 14:16:06 crc kubenswrapper[4585]: I1201 14:16:06.568170 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 14:16:07 crc kubenswrapper[4585]: I1201 14:16:07.461122 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1b331842-f530-45ae-92e4-59fd379833b0","Type":"ContainerStarted","Data":"d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e"} Dec 01 14:16:07 crc kubenswrapper[4585]: I1201 14:16:07.464818 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a0f792a5-42c4-4a55-af66-55abc1be9684" containerName="cinder-api-log" containerID="cri-o://4d4ac1f3d1190d82d6047f199043485ff97e61fab0a004e4c8f649462887feb6" gracePeriod=30 Dec 01 14:16:07 crc kubenswrapper[4585]: I1201 14:16:07.465093 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0f792a5-42c4-4a55-af66-55abc1be9684","Type":"ContainerStarted","Data":"a4558e1522ce76ce77e9244033bd909cde8395398e9db6d8be4e40428e57641d"} Dec 01 14:16:07 crc kubenswrapper[4585]: I1201 14:16:07.465129 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 14:16:07 crc kubenswrapper[4585]: I1201 14:16:07.465318 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a0f792a5-42c4-4a55-af66-55abc1be9684" containerName="cinder-api" containerID="cri-o://a4558e1522ce76ce77e9244033bd909cde8395398e9db6d8be4e40428e57641d" gracePeriod=30 Dec 01 14:16:07 crc kubenswrapper[4585]: I1201 14:16:07.497191 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.497174819 podStartE2EDuration="4.497174819s" podCreationTimestamp="2025-12-01 14:16:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:16:07.491355594 +0000 UTC m=+1081.475569449" watchObservedRunningTime="2025-12-01 14:16:07.497174819 +0000 UTC m=+1081.481388674" Dec 01 14:16:08 crc kubenswrapper[4585]: I1201 14:16:08.267242 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-szrfs" podUID="f1901264-234a-4675-9867-3fb1f2689592" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.152:5353: i/o timeout" Dec 01 14:16:08 crc kubenswrapper[4585]: I1201 14:16:08.498788 4585 generic.go:334] "Generic (PLEG): container finished" podID="a0f792a5-42c4-4a55-af66-55abc1be9684" containerID="4d4ac1f3d1190d82d6047f199043485ff97e61fab0a004e4c8f649462887feb6" exitCode=143 Dec 01 14:16:08 crc kubenswrapper[4585]: I1201 14:16:08.498896 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"a0f792a5-42c4-4a55-af66-55abc1be9684","Type":"ContainerDied","Data":"4d4ac1f3d1190d82d6047f199043485ff97e61fab0a004e4c8f649462887feb6"} Dec 01 14:16:08 crc kubenswrapper[4585]: I1201 14:16:08.507271 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1b331842-f530-45ae-92e4-59fd379833b0","Type":"ContainerStarted","Data":"19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636"} Dec 01 14:16:08 crc kubenswrapper[4585]: I1201 14:16:08.531002 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.9449866 podStartE2EDuration="5.530985737s" podCreationTimestamp="2025-12-01 14:16:03 +0000 UTC" firstStartedPulling="2025-12-01 14:16:04.591123176 +0000 UTC m=+1078.575337031" lastFinishedPulling="2025-12-01 14:16:06.177122313 +0000 UTC m=+1080.161336168" observedRunningTime="2025-12-01 14:16:08.525494911 +0000 UTC m=+1082.509708766" watchObservedRunningTime="2025-12-01 14:16:08.530985737 +0000 UTC m=+1082.515199592" Dec 01 14:16:08 crc kubenswrapper[4585]: I1201 14:16:08.781430 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.130452 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.259580 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-77c848ff4-gtzqf" podUID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:36000->10.217.0.161:9311: read: connection reset by peer" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.259623 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-77c848ff4-gtzqf" podUID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.161:9311/healthcheck\": read tcp 10.217.0.2:35992->10.217.0.161:9311: read: connection reset by peer" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.414390 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5594675dd-jdqsw" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.416754 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.507502 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192bd785-3d80-4b5e-8db2-55a0e3846802-log-httpd\") pod \"192bd785-3d80-4b5e-8db2-55a0e3846802\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.507551 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-scripts\") pod \"192bd785-3d80-4b5e-8db2-55a0e3846802\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.507587 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25flm\" (UniqueName: \"kubernetes.io/projected/192bd785-3d80-4b5e-8db2-55a0e3846802-kube-api-access-25flm\") pod \"192bd785-3d80-4b5e-8db2-55a0e3846802\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.507634 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192bd785-3d80-4b5e-8db2-55a0e3846802-run-httpd\") pod \"192bd785-3d80-4b5e-8db2-55a0e3846802\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.507653 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-config-data\") pod \"192bd785-3d80-4b5e-8db2-55a0e3846802\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.507738 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-combined-ca-bundle\") pod \"192bd785-3d80-4b5e-8db2-55a0e3846802\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.507791 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-sg-core-conf-yaml\") pod \"192bd785-3d80-4b5e-8db2-55a0e3846802\" (UID: \"192bd785-3d80-4b5e-8db2-55a0e3846802\") " Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.508368 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/192bd785-3d80-4b5e-8db2-55a0e3846802-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "192bd785-3d80-4b5e-8db2-55a0e3846802" (UID: "192bd785-3d80-4b5e-8db2-55a0e3846802"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.508572 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/192bd785-3d80-4b5e-8db2-55a0e3846802-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "192bd785-3d80-4b5e-8db2-55a0e3846802" (UID: "192bd785-3d80-4b5e-8db2-55a0e3846802"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.522537 4585 generic.go:334] "Generic (PLEG): container finished" podID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerID="b9bb77f521832e57692d2665547f08652dbf3fb329788eaea1b3c7ea2807b528" exitCode=0 Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.522593 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77c848ff4-gtzqf" event={"ID":"9dce0a63-d2c7-4004-b78a-5e7016fe10e9","Type":"ContainerDied","Data":"b9bb77f521832e57692d2665547f08652dbf3fb329788eaea1b3c7ea2807b528"} Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.526131 4585 generic.go:334] "Generic (PLEG): container finished" podID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerID="192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909" exitCode=0 Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.526388 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.526668 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192bd785-3d80-4b5e-8db2-55a0e3846802","Type":"ContainerDied","Data":"192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909"} Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.526697 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"192bd785-3d80-4b5e-8db2-55a0e3846802","Type":"ContainerDied","Data":"300b84db12aa113144ca01a0ef795f55e74e70e671d8419994f67cc9a4ad413b"} Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.526712 4585 scope.go:117] "RemoveContainer" containerID="ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.537299 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/192bd785-3d80-4b5e-8db2-55a0e3846802-kube-api-access-25flm" (OuterVolumeSpecName: "kube-api-access-25flm") pod "192bd785-3d80-4b5e-8db2-55a0e3846802" (UID: "192bd785-3d80-4b5e-8db2-55a0e3846802"). InnerVolumeSpecName "kube-api-access-25flm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.537438 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-scripts" (OuterVolumeSpecName: "scripts") pod "192bd785-3d80-4b5e-8db2-55a0e3846802" (UID: "192bd785-3d80-4b5e-8db2-55a0e3846802"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.599058 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "192bd785-3d80-4b5e-8db2-55a0e3846802" (UID: "192bd785-3d80-4b5e-8db2-55a0e3846802"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.610332 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192bd785-3d80-4b5e-8db2-55a0e3846802-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.610358 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.610386 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25flm\" (UniqueName: \"kubernetes.io/projected/192bd785-3d80-4b5e-8db2-55a0e3846802-kube-api-access-25flm\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.610398 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/192bd785-3d80-4b5e-8db2-55a0e3846802-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.610406 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.646299 4585 scope.go:117] "RemoveContainer" containerID="e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.673197 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "192bd785-3d80-4b5e-8db2-55a0e3846802" (UID: "192bd785-3d80-4b5e-8db2-55a0e3846802"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.691142 4585 scope.go:117] "RemoveContainer" containerID="192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.713731 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.735083 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.770333 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-config-data" (OuterVolumeSpecName: "config-data") pod "192bd785-3d80-4b5e-8db2-55a0e3846802" (UID: "192bd785-3d80-4b5e-8db2-55a0e3846802"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.798649 4585 scope.go:117] "RemoveContainer" containerID="a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.818087 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-config-data\") pod \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.818214 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-config-data-custom\") pod \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.818266 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-logs\") pod \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.818286 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-combined-ca-bundle\") pod \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.818322 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwn6m\" (UniqueName: \"kubernetes.io/projected/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-kube-api-access-lwn6m\") pod \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\" (UID: \"9dce0a63-d2c7-4004-b78a-5e7016fe10e9\") " Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.818676 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192bd785-3d80-4b5e-8db2-55a0e3846802-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.823523 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9dce0a63-d2c7-4004-b78a-5e7016fe10e9" (UID: "9dce0a63-d2c7-4004-b78a-5e7016fe10e9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.823851 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-logs" (OuterVolumeSpecName: "logs") pod "9dce0a63-d2c7-4004-b78a-5e7016fe10e9" (UID: "9dce0a63-d2c7-4004-b78a-5e7016fe10e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.833150 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-kube-api-access-lwn6m" (OuterVolumeSpecName: "kube-api-access-lwn6m") pod "9dce0a63-d2c7-4004-b78a-5e7016fe10e9" (UID: "9dce0a63-d2c7-4004-b78a-5e7016fe10e9"). InnerVolumeSpecName "kube-api-access-lwn6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.846911 4585 scope.go:117] "RemoveContainer" containerID="ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881" Dec 01 14:16:09 crc kubenswrapper[4585]: E1201 14:16:09.857080 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881\": container with ID starting with ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881 not found: ID does not exist" containerID="ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.857290 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881"} err="failed to get container status \"ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881\": rpc error: code = NotFound desc = could not find container \"ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881\": container with ID starting with ccc5e3e663880685617e2d8343b2fcb96ecb36f91ce92ba25410114793298881 not found: ID does not exist" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.857390 4585 scope.go:117] "RemoveContainer" containerID="e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a" Dec 01 14:16:09 crc kubenswrapper[4585]: E1201 14:16:09.859360 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a\": container with ID starting with e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a not found: ID does not exist" containerID="e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.859466 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a"} err="failed to get container status \"e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a\": rpc error: code = NotFound desc = could not find container \"e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a\": container with ID starting with e0f5c62f7600bf9dc4ace09b2dbd0f4fa25968e200a287e821eb0803416bbb1a not found: ID does not exist" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.859535 4585 scope.go:117] "RemoveContainer" containerID="192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909" Dec 01 14:16:09 crc kubenswrapper[4585]: E1201 14:16:09.863201 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909\": container with ID starting with 192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909 not found: ID does not exist" containerID="192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.863311 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909"} err="failed to get container status \"192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909\": rpc error: code = NotFound desc = could not 
find container \"192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909\": container with ID starting with 192f8a3c9f0df4f6db339068d880c601748747fe695b451b112e9dbbdef72909 not found: ID does not exist" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.863378 4585 scope.go:117] "RemoveContainer" containerID="a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6" Dec 01 14:16:09 crc kubenswrapper[4585]: E1201 14:16:09.870231 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6\": container with ID starting with a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6 not found: ID does not exist" containerID="a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.870280 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6"} err="failed to get container status \"a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6\": rpc error: code = NotFound desc = could not find container \"a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6\": container with ID starting with a78817c1306272222990b2fafaaff52eb95a6582122a7f87b0e25cce4a8798d6 not found: ID does not exist" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.886424 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.898101 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dce0a63-d2c7-4004-b78a-5e7016fe10e9" (UID: "9dce0a63-d2c7-4004-b78a-5e7016fe10e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.912145 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.924746 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.924891 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.925207 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwn6m\" (UniqueName: \"kubernetes.io/projected/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-kube-api-access-lwn6m\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.925384 4585 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.941022 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:16:09 crc kubenswrapper[4585]: E1201 14:16:09.941787 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="ceilometer-notification-agent" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.941801 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="ceilometer-notification-agent" Dec 01 14:16:09 crc kubenswrapper[4585]: E1201 14:16:09.941837 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerName="barbican-api-log" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.941843 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerName="barbican-api-log" Dec 01 14:16:09 crc kubenswrapper[4585]: E1201 14:16:09.941856 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="proxy-httpd" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.941863 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="proxy-httpd" Dec 01 14:16:09 crc kubenswrapper[4585]: E1201 14:16:09.941883 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9edaf4-b918-4370-8184-79de4b087dfc" containerName="neutron-api" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.941890 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9edaf4-b918-4370-8184-79de4b087dfc" containerName="neutron-api" Dec 01 14:16:09 crc kubenswrapper[4585]: E1201 14:16:09.941911 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9edaf4-b918-4370-8184-79de4b087dfc" containerName="neutron-httpd" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.941917 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9edaf4-b918-4370-8184-79de4b087dfc" containerName="neutron-httpd" Dec 01 14:16:09 crc kubenswrapper[4585]: E1201 14:16:09.941939 4585 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="ceilometer-central-agent" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.941944 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="ceilometer-central-agent" Dec 01 14:16:09 crc kubenswrapper[4585]: E1201 14:16:09.941962 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerName="barbican-api" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.941982 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerName="barbican-api" Dec 01 14:16:09 crc kubenswrapper[4585]: E1201 14:16:09.942000 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="sg-core" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.942006 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="sg-core" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.942323 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="sg-core" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.942344 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="ceilometer-central-agent" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.942366 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9edaf4-b918-4370-8184-79de4b087dfc" containerName="neutron-httpd" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.942381 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="proxy-httpd" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.942401 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" containerName="ceilometer-notification-agent" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.942407 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9edaf4-b918-4370-8184-79de4b087dfc" containerName="neutron-api" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.942417 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerName="barbican-api" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.942434 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" containerName="barbican-api-log" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.955451 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-config-data" (OuterVolumeSpecName: "config-data") pod "9dce0a63-d2c7-4004-b78a-5e7016fe10e9" (UID: "9dce0a63-d2c7-4004-b78a-5e7016fe10e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.961140 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.964741 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.965014 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 14:16:09 crc kubenswrapper[4585]: I1201 14:16:09.990244 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.026845 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvmrs\" (UniqueName: \"kubernetes.io/projected/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-kube-api-access-dvmrs\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.026900 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-scripts\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.026927 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-run-httpd\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.026955 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.027020 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.027048 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-config-data\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.027078 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-log-httpd\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.027124 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dce0a63-d2c7-4004-b78a-5e7016fe10e9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.128157 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.128221 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.128251 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-config-data\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.128282 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-log-httpd\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.128364 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvmrs\" (UniqueName: \"kubernetes.io/projected/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-kube-api-access-dvmrs\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.128383 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-scripts\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.128410 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-run-httpd\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.128926 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-run-httpd\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.129175 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-log-httpd\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.132699 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.135753 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.142688 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-config-data\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.147342 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-scripts\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.147900 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvmrs\" (UniqueName: \"kubernetes.io/projected/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-kube-api-access-dvmrs\") pod \"ceilometer-0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.282792 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.427606 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="192bd785-3d80-4b5e-8db2-55a0e3846802" path="/var/lib/kubelet/pods/192bd785-3d80-4b5e-8db2-55a0e3846802/volumes" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.545658 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77c848ff4-gtzqf" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.546267 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77c848ff4-gtzqf" event={"ID":"9dce0a63-d2c7-4004-b78a-5e7016fe10e9","Type":"ContainerDied","Data":"9b56d7d3c4b389b564a1a9b8ef560950db9b065727e314987d81a739139362e4"} Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.546329 4585 scope.go:117] "RemoveContainer" containerID="b9bb77f521832e57692d2665547f08652dbf3fb329788eaea1b3c7ea2807b528" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.574704 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-77c848ff4-gtzqf"] Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.575854 4585 scope.go:117] "RemoveContainer" containerID="6bc4be9fb2c8c795d2d716883a9fe496c594d9899aca60b4fb41a4607d1e190f" Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.590946 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-77c848ff4-gtzqf"] Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.909251 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:16:10 crc kubenswrapper[4585]: I1201 14:16:10.979272 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-576d96b8bf-jl74m" Dec 01 14:16:11 crc kubenswrapper[4585]: I1201 14:16:11.566126 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0","Type":"ContainerStarted","Data":"37ea6096c5168750f5d14f8fda1b339dc338d5e85c09775514a41af279e9af46"} Dec 01 14:16:12 crc kubenswrapper[4585]: I1201 14:16:12.422056 4585 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="9dce0a63-d2c7-4004-b78a-5e7016fe10e9" path="/var/lib/kubelet/pods/9dce0a63-d2c7-4004-b78a-5e7016fe10e9/volumes" Dec 01 14:16:12 crc kubenswrapper[4585]: I1201 14:16:12.602914 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0","Type":"ContainerStarted","Data":"b0eb593f0744ac672633eec711ebaac9fa0a287fef4a949cc99dcd5d1b19b7d3"} Dec 01 14:16:13 crc kubenswrapper[4585]: I1201 14:16:13.612655 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0","Type":"ContainerStarted","Data":"35efd55eac44182903f892f02ab338367b78ab9ae756aaa0f340e6e43426aacc"} Dec 01 14:16:13 crc kubenswrapper[4585]: I1201 14:16:13.827213 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:16:13 crc kubenswrapper[4585]: I1201 14:16:13.914666 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-5g4hh"] Dec 01 14:16:13 crc kubenswrapper[4585]: I1201 14:16:13.915210 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" podUID="a857dd33-a536-4375-80cc-318b8a8023b6" containerName="dnsmasq-dns" containerID="cri-o://f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32" gracePeriod=10 Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.129594 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.173800 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.457837 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.544357 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-ovsdbserver-nb\") pod \"a857dd33-a536-4375-80cc-318b8a8023b6\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.544893 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs5n8\" (UniqueName: \"kubernetes.io/projected/a857dd33-a536-4375-80cc-318b8a8023b6-kube-api-access-zs5n8\") pod \"a857dd33-a536-4375-80cc-318b8a8023b6\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.544943 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-ovsdbserver-sb\") pod \"a857dd33-a536-4375-80cc-318b8a8023b6\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.545016 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-dns-svc\") pod \"a857dd33-a536-4375-80cc-318b8a8023b6\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.545094 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-dns-swift-storage-0\") pod \"a857dd33-a536-4375-80cc-318b8a8023b6\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.545133 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-config\") pod \"a857dd33-a536-4375-80cc-318b8a8023b6\" (UID: \"a857dd33-a536-4375-80cc-318b8a8023b6\") " Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.567317 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a857dd33-a536-4375-80cc-318b8a8023b6-kube-api-access-zs5n8" (OuterVolumeSpecName: "kube-api-access-zs5n8") pod "a857dd33-a536-4375-80cc-318b8a8023b6" (UID: "a857dd33-a536-4375-80cc-318b8a8023b6"). InnerVolumeSpecName "kube-api-access-zs5n8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.595666 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 01 14:16:14 crc kubenswrapper[4585]: E1201 14:16:14.596247 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a857dd33-a536-4375-80cc-318b8a8023b6" containerName="dnsmasq-dns" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.596264 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a857dd33-a536-4375-80cc-318b8a8023b6" containerName="dnsmasq-dns" Dec 01 14:16:14 crc kubenswrapper[4585]: E1201 14:16:14.596307 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a857dd33-a536-4375-80cc-318b8a8023b6" containerName="init" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.596314 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a857dd33-a536-4375-80cc-318b8a8023b6" containerName="init" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.596501 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a857dd33-a536-4375-80cc-318b8a8023b6" containerName="dnsmasq-dns" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.597153 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.599619 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-hkrb7" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.600138 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.600265 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.613310 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.632236 4585 generic.go:334] "Generic (PLEG): container finished" podID="a857dd33-a536-4375-80cc-318b8a8023b6" containerID="f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32" exitCode=0 Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.632309 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" event={"ID":"a857dd33-a536-4375-80cc-318b8a8023b6","Type":"ContainerDied","Data":"f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32"} Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.632340 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" event={"ID":"a857dd33-a536-4375-80cc-318b8a8023b6","Type":"ContainerDied","Data":"9e57c7464ae33b9da0dc4fc44f2a21ca7cea020cbff53d2a34423f406f594a47"} Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.632358 4585 scope.go:117] "RemoveContainer" containerID="f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.632499 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-5g4hh" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.641472 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1b331842-f530-45ae-92e4-59fd379833b0" containerName="cinder-scheduler" containerID="cri-o://d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e" gracePeriod=30 Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.641677 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0","Type":"ContainerStarted","Data":"44ac16e77a217a165b7ab75f4de65db78cc2a9b0babeb3e2eb14c57da0487bba"} Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.641913 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1b331842-f530-45ae-92e4-59fd379833b0" containerName="probe" containerID="cri-o://19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636" gracePeriod=30 Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.648704 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f23dd7-40f6-4677-bd4c-ebdf3152b72f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"29f23dd7-40f6-4677-bd4c-ebdf3152b72f\") " pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.648830 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/29f23dd7-40f6-4677-bd4c-ebdf3152b72f-openstack-config\") pod \"openstackclient\" (UID: \"29f23dd7-40f6-4677-bd4c-ebdf3152b72f\") " pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.648851 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvs8h\" (UniqueName: \"kubernetes.io/projected/29f23dd7-40f6-4677-bd4c-ebdf3152b72f-kube-api-access-gvs8h\") pod \"openstackclient\" (UID: \"29f23dd7-40f6-4677-bd4c-ebdf3152b72f\") " pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.648895 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/29f23dd7-40f6-4677-bd4c-ebdf3152b72f-openstack-config-secret\") pod \"openstackclient\" (UID: \"29f23dd7-40f6-4677-bd4c-ebdf3152b72f\") " pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.649010 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs5n8\" (UniqueName: \"kubernetes.io/projected/a857dd33-a536-4375-80cc-318b8a8023b6-kube-api-access-zs5n8\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.659199 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-config" (OuterVolumeSpecName: "config") pod "a857dd33-a536-4375-80cc-318b8a8023b6" (UID: "a857dd33-a536-4375-80cc-318b8a8023b6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.665810 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a857dd33-a536-4375-80cc-318b8a8023b6" (UID: "a857dd33-a536-4375-80cc-318b8a8023b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.667862 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a857dd33-a536-4375-80cc-318b8a8023b6" (UID: "a857dd33-a536-4375-80cc-318b8a8023b6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.686415 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a857dd33-a536-4375-80cc-318b8a8023b6" (UID: "a857dd33-a536-4375-80cc-318b8a8023b6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.709680 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a857dd33-a536-4375-80cc-318b8a8023b6" (UID: "a857dd33-a536-4375-80cc-318b8a8023b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.750624 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/29f23dd7-40f6-4677-bd4c-ebdf3152b72f-openstack-config\") pod \"openstackclient\" (UID: \"29f23dd7-40f6-4677-bd4c-ebdf3152b72f\") " pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.750667 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvs8h\" (UniqueName: \"kubernetes.io/projected/29f23dd7-40f6-4677-bd4c-ebdf3152b72f-kube-api-access-gvs8h\") pod \"openstackclient\" (UID: \"29f23dd7-40f6-4677-bd4c-ebdf3152b72f\") " pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.750715 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/29f23dd7-40f6-4677-bd4c-ebdf3152b72f-openstack-config-secret\") pod \"openstackclient\" (UID: \"29f23dd7-40f6-4677-bd4c-ebdf3152b72f\") " pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.750754 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f23dd7-40f6-4677-bd4c-ebdf3152b72f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"29f23dd7-40f6-4677-bd4c-ebdf3152b72f\") " pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.752173 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-ovsdbserver-sb\") on node 
\"crc\" DevicePath \"\"" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.752201 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.752211 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.752222 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.752231 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a857dd33-a536-4375-80cc-318b8a8023b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.753293 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/29f23dd7-40f6-4677-bd4c-ebdf3152b72f-openstack-config\") pod \"openstackclient\" (UID: \"29f23dd7-40f6-4677-bd4c-ebdf3152b72f\") " pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.755517 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f23dd7-40f6-4677-bd4c-ebdf3152b72f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"29f23dd7-40f6-4677-bd4c-ebdf3152b72f\") " pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.762398 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/29f23dd7-40f6-4677-bd4c-ebdf3152b72f-openstack-config-secret\") pod \"openstackclient\" (UID: \"29f23dd7-40f6-4677-bd4c-ebdf3152b72f\") " pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.775750 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvs8h\" (UniqueName: \"kubernetes.io/projected/29f23dd7-40f6-4677-bd4c-ebdf3152b72f-kube-api-access-gvs8h\") pod \"openstackclient\" (UID: \"29f23dd7-40f6-4677-bd4c-ebdf3152b72f\") " pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.859144 4585 scope.go:117] "RemoveContainer" containerID="d2c1ab56ed39195ab4749b25a23c8d4a0a9b9c1f9162e96ad7c146782c89f5a4" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.881134 4585 scope.go:117] "RemoveContainer" containerID="f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32" Dec 01 14:16:14 crc kubenswrapper[4585]: E1201 14:16:14.881497 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32\": container with ID starting with f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32 not found: ID does not exist" containerID="f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.881534 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32"} err="failed to get container status \"f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32\": rpc error: code = NotFound desc = could not find container \"f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32\": container with ID starting with f17be2060ba478c7cdd9bf40be0173d1531cfd8b94f864a02657c9ed11f06e32 not found: ID does not exist" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.881557 4585 scope.go:117] "RemoveContainer" containerID="d2c1ab56ed39195ab4749b25a23c8d4a0a9b9c1f9162e96ad7c146782c89f5a4" Dec 01 14:16:14 crc kubenswrapper[4585]: E1201 14:16:14.882227 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c1ab56ed39195ab4749b25a23c8d4a0a9b9c1f9162e96ad7c146782c89f5a4\": container with ID starting with d2c1ab56ed39195ab4749b25a23c8d4a0a9b9c1f9162e96ad7c146782c89f5a4 not found: ID does not exist" containerID="d2c1ab56ed39195ab4749b25a23c8d4a0a9b9c1f9162e96ad7c146782c89f5a4" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.882285 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c1ab56ed39195ab4749b25a23c8d4a0a9b9c1f9162e96ad7c146782c89f5a4"} err="failed to get container status \"d2c1ab56ed39195ab4749b25a23c8d4a0a9b9c1f9162e96ad7c146782c89f5a4\": rpc error: code = NotFound desc = could not find container \"d2c1ab56ed39195ab4749b25a23c8d4a0a9b9c1f9162e96ad7c146782c89f5a4\": container with ID starting with d2c1ab56ed39195ab4749b25a23c8d4a0a9b9c1f9162e96ad7c146782c89f5a4 not found: ID does not exist" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.932518 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.980428 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-5g4hh"] Dec 01 14:16:14 crc kubenswrapper[4585]: I1201 14:16:14.988288 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-5g4hh"] Dec 01 14:16:15 crc kubenswrapper[4585]: I1201 14:16:15.484277 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 14:16:15 crc kubenswrapper[4585]: I1201 14:16:15.649769 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"29f23dd7-40f6-4677-bd4c-ebdf3152b72f","Type":"ContainerStarted","Data":"89e9420b920a2afa79ed6b16eb8af7761c338d2dd8b9ae62a744d00329354ab4"} Dec 01 14:16:15 crc kubenswrapper[4585]: I1201 14:16:15.653164 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0","Type":"ContainerStarted","Data":"3a9ba0070c1ab7d5e1a7069b52779c3f7403565c411ecec70078fa83ead02784"} Dec 01 14:16:15 crc kubenswrapper[4585]: I1201 14:16:15.654184 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 14:16:15 crc kubenswrapper[4585]: I1201 14:16:15.678372 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.683291405 podStartE2EDuration="6.678351937s" podCreationTimestamp="2025-12-01 14:16:09 +0000 UTC" firstStartedPulling="2025-12-01 14:16:10.916163867 +0000 UTC m=+1084.900377722" lastFinishedPulling="2025-12-01 14:16:14.911224399 +0000 UTC m=+1088.895438254" observedRunningTime="2025-12-01 14:16:15.673884128 +0000 UTC m=+1089.658097983" watchObservedRunningTime="2025-12-01 14:16:15.678351937 +0000 UTC m=+1089.662565792" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.263242 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.389515 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-combined-ca-bundle\") pod \"1b331842-f530-45ae-92e4-59fd379833b0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.389575 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qfdg\" (UniqueName: \"kubernetes.io/projected/1b331842-f530-45ae-92e4-59fd379833b0-kube-api-access-9qfdg\") pod \"1b331842-f530-45ae-92e4-59fd379833b0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.389611 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-scripts\") pod \"1b331842-f530-45ae-92e4-59fd379833b0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.389685 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b331842-f530-45ae-92e4-59fd379833b0-etc-machine-id\") pod \"1b331842-f530-45ae-92e4-59fd379833b0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.389743 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-config-data-custom\") pod \"1b331842-f530-45ae-92e4-59fd379833b0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.389784 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-config-data\") pod \"1b331842-f530-45ae-92e4-59fd379833b0\" (UID: \"1b331842-f530-45ae-92e4-59fd379833b0\") " Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.390840 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b331842-f530-45ae-92e4-59fd379833b0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1b331842-f530-45ae-92e4-59fd379833b0" (UID: "1b331842-f530-45ae-92e4-59fd379833b0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.396322 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b331842-f530-45ae-92e4-59fd379833b0-kube-api-access-9qfdg" (OuterVolumeSpecName: "kube-api-access-9qfdg") pod "1b331842-f530-45ae-92e4-59fd379833b0" (UID: "1b331842-f530-45ae-92e4-59fd379833b0"). InnerVolumeSpecName "kube-api-access-9qfdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.404538 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1b331842-f530-45ae-92e4-59fd379833b0" (UID: "1b331842-f530-45ae-92e4-59fd379833b0"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.404826 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-scripts" (OuterVolumeSpecName: "scripts") pod "1b331842-f530-45ae-92e4-59fd379833b0" (UID: "1b331842-f530-45ae-92e4-59fd379833b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.430691 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a857dd33-a536-4375-80cc-318b8a8023b6" path="/var/lib/kubelet/pods/a857dd33-a536-4375-80cc-318b8a8023b6/volumes" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.492847 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qfdg\" (UniqueName: \"kubernetes.io/projected/1b331842-f530-45ae-92e4-59fd379833b0-kube-api-access-9qfdg\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.492879 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.492889 4585 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b331842-f530-45ae-92e4-59fd379833b0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.492902 4585 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.502189 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b331842-f530-45ae-92e4-59fd379833b0" (UID: "1b331842-f530-45ae-92e4-59fd379833b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.517866 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-config-data" (OuterVolumeSpecName: "config-data") pod "1b331842-f530-45ae-92e4-59fd379833b0" (UID: "1b331842-f530-45ae-92e4-59fd379833b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.601575 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.604932 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b331842-f530-45ae-92e4-59fd379833b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.671809 4585 generic.go:334] "Generic (PLEG): container finished" podID="1b331842-f530-45ae-92e4-59fd379833b0" containerID="19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636" exitCode=0 Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.671839 4585 generic.go:334] "Generic (PLEG): container finished" podID="1b331842-f530-45ae-92e4-59fd379833b0" containerID="d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e" exitCode=0 Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.672132 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.673029 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1b331842-f530-45ae-92e4-59fd379833b0","Type":"ContainerDied","Data":"19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636"} Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.673053 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1b331842-f530-45ae-92e4-59fd379833b0","Type":"ContainerDied","Data":"d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e"} Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.673064 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1b331842-f530-45ae-92e4-59fd379833b0","Type":"ContainerDied","Data":"fdde948a4a318c9a93a5e724b97fa3471a798c0b5ef9238d83356ee264b89904"} Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.673077 4585 scope.go:117] "RemoveContainer" containerID="19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.702201 4585 scope.go:117] "RemoveContainer" containerID="d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.727109 4585 scope.go:117] "RemoveContainer" containerID="19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.727565 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 14:16:16 crc kubenswrapper[4585]: E1201 14:16:16.727903 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636\": container with ID starting with 19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636 not found: ID does not exist" containerID="19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.727934 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636"} err="failed to get container status \"19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636\": rpc error: code = NotFound desc = could not find container \"19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636\": container with ID starting with 19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636 not found: ID does not exist" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.727963 4585 scope.go:117] "RemoveContainer" containerID="d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e" Dec 01 14:16:16 crc kubenswrapper[4585]: E1201 14:16:16.731064 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e\": container with ID starting with d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e not found: ID does not exist" containerID="d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.731094 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e"} err="failed to get container status \"d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e\": rpc error: code = NotFound desc = could not find container \"d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e\": container with ID starting with d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e not found: ID does not exist" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.731113 4585 scope.go:117] "RemoveContainer" containerID="19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.731820 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636"} err="failed to get container status \"19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636\": rpc error: code = NotFound desc = could not find container \"19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636\": container with ID starting with 19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636 not found: ID does not exist" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.731851 4585 scope.go:117] "RemoveContainer" containerID="d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.732217 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e"} err="failed to get container status \"d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e\": rpc error: code = NotFound desc = could not find container \"d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e\": container with ID starting with d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e not found: ID does not exist" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.749031 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.756796 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 
01 14:16:16 crc kubenswrapper[4585]: E1201 14:16:16.757201 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b331842-f530-45ae-92e4-59fd379833b0" containerName="probe" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.757217 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b331842-f530-45ae-92e4-59fd379833b0" containerName="probe" Dec 01 14:16:16 crc kubenswrapper[4585]: E1201 14:16:16.757245 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b331842-f530-45ae-92e4-59fd379833b0" containerName="cinder-scheduler" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.757251 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b331842-f530-45ae-92e4-59fd379833b0" containerName="cinder-scheduler" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.757430 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b331842-f530-45ae-92e4-59fd379833b0" containerName="cinder-scheduler" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.757452 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b331842-f530-45ae-92e4-59fd379833b0" containerName="probe" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.758354 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.761260 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.768951 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.910519 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-config-data\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.910639 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-scripts\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.910744 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.910781 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.910936 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5r7g\" (UniqueName: \"kubernetes.io/projected/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-kube-api-access-t5r7g\") pod \"cinder-scheduler-0\" (UID: 
\"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:16 crc kubenswrapper[4585]: I1201 14:16:16.911165 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.013172 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5r7g\" (UniqueName: \"kubernetes.io/projected/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-kube-api-access-t5r7g\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.013270 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.013317 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-config-data\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.013355 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-scripts\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.013382 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.013397 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.013482 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.020684 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.022288 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-config-data\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.022433 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.022480 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-scripts\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.036621 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5r7g\" (UniqueName: \"kubernetes.io/projected/68b9fd2e-0f05-46d6-86aa-319cbbf01db1-kube-api-access-t5r7g\") pod \"cinder-scheduler-0\" (UID: \"68b9fd2e-0f05-46d6-86aa-319cbbf01db1\") " pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.096694 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.573056 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.645692 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 14:16:17 crc kubenswrapper[4585]: I1201 14:16:17.725519 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"68b9fd2e-0f05-46d6-86aa-319cbbf01db1","Type":"ContainerStarted","Data":"7e99f56bee1e82cabfc318b599e1a684527494af8398e2c68b642cac84c86200"} Dec 01 14:16:18 crc kubenswrapper[4585]: I1201 14:16:18.426960 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b331842-f530-45ae-92e4-59fd379833b0" path="/var/lib/kubelet/pods/1b331842-f530-45ae-92e4-59fd379833b0/volumes" Dec 01 14:16:18 crc kubenswrapper[4585]: I1201 14:16:18.752249 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"68b9fd2e-0f05-46d6-86aa-319cbbf01db1","Type":"ContainerStarted","Data":"2fec7cff1372ebc24ef9a13acfecea3707a0c1124948afda5f804f7b5c37a090"} Dec 01 14:16:19 crc kubenswrapper[4585]: I1201 14:16:19.684838 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:16:19 crc kubenswrapper[4585]: I1201 14:16:19.685373 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="ceilometer-central-agent" containerID="cri-o://b0eb593f0744ac672633eec711ebaac9fa0a287fef4a949cc99dcd5d1b19b7d3" gracePeriod=30 Dec 01 14:16:19 crc kubenswrapper[4585]: I1201 14:16:19.685716 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="sg-core" containerID="cri-o://44ac16e77a217a165b7ab75f4de65db78cc2a9b0babeb3e2eb14c57da0487bba" gracePeriod=30 Dec 01 14:16:19 crc kubenswrapper[4585]: I1201 
14:16:19.685762 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="ceilometer-notification-agent" containerID="cri-o://35efd55eac44182903f892f02ab338367b78ab9ae756aaa0f340e6e43426aacc" gracePeriod=30 Dec 01 14:16:19 crc kubenswrapper[4585]: I1201 14:16:19.685790 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="proxy-httpd" containerID="cri-o://3a9ba0070c1ab7d5e1a7069b52779c3f7403565c411ecec70078fa83ead02784" gracePeriod=30 Dec 01 14:16:19 crc kubenswrapper[4585]: I1201 14:16:19.828020 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"68b9fd2e-0f05-46d6-86aa-319cbbf01db1","Type":"ContainerStarted","Data":"854985f13ff2c329d7e4cff48ee243680eb4eec284a62b2d6f332efcf5164705"} Dec 01 14:16:19 crc kubenswrapper[4585]: I1201 14:16:19.877040 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.877014157 podStartE2EDuration="3.877014157s" podCreationTimestamp="2025-12-01 14:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:16:19.859520041 +0000 UTC m=+1093.843733896" watchObservedRunningTime="2025-12-01 14:16:19.877014157 +0000 UTC m=+1093.861228012" Dec 01 14:16:20 crc kubenswrapper[4585]: I1201 14:16:20.849703 4585 generic.go:334] "Generic (PLEG): container finished" podID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerID="3a9ba0070c1ab7d5e1a7069b52779c3f7403565c411ecec70078fa83ead02784" exitCode=0 Dec 01 14:16:20 crc kubenswrapper[4585]: I1201 14:16:20.850085 4585 generic.go:334] "Generic (PLEG): container finished" podID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerID="44ac16e77a217a165b7ab75f4de65db78cc2a9b0babeb3e2eb14c57da0487bba" exitCode=2 Dec 01 14:16:20 crc kubenswrapper[4585]: I1201 14:16:20.850096 4585 generic.go:334] "Generic (PLEG): container finished" podID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerID="35efd55eac44182903f892f02ab338367b78ab9ae756aaa0f340e6e43426aacc" exitCode=0 Dec 01 14:16:20 crc kubenswrapper[4585]: I1201 14:16:20.850103 4585 generic.go:334] "Generic (PLEG): container finished" podID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerID="b0eb593f0744ac672633eec711ebaac9fa0a287fef4a949cc99dcd5d1b19b7d3" exitCode=0 Dec 01 14:16:20 crc kubenswrapper[4585]: I1201 14:16:20.849749 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0","Type":"ContainerDied","Data":"3a9ba0070c1ab7d5e1a7069b52779c3f7403565c411ecec70078fa83ead02784"} Dec 01 14:16:20 crc kubenswrapper[4585]: I1201 14:16:20.851183 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0","Type":"ContainerDied","Data":"44ac16e77a217a165b7ab75f4de65db78cc2a9b0babeb3e2eb14c57da0487bba"} Dec 01 14:16:20 crc kubenswrapper[4585]: I1201 14:16:20.851200 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0","Type":"ContainerDied","Data":"35efd55eac44182903f892f02ab338367b78ab9ae756aaa0f340e6e43426aacc"} Dec 01 14:16:20 crc kubenswrapper[4585]: I1201 14:16:20.851209 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0","Type":"ContainerDied","Data":"b0eb593f0744ac672633eec711ebaac9fa0a287fef4a949cc99dcd5d1b19b7d3"} Dec 01 14:16:20 crc kubenswrapper[4585]: I1201 14:16:20.851219 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0","Type":"ContainerDied","Data":"37ea6096c5168750f5d14f8fda1b339dc338d5e85c09775514a41af279e9af46"} Dec 01 14:16:20 crc kubenswrapper[4585]: I1201 14:16:20.851228 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37ea6096c5168750f5d14f8fda1b339dc338d5e85c09775514a41af279e9af46" Dec 01 14:16:20 crc kubenswrapper[4585]: I1201 14:16:20.873184 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.008467 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvmrs\" (UniqueName: \"kubernetes.io/projected/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-kube-api-access-dvmrs\") pod \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.008559 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-combined-ca-bundle\") pod \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.008622 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-scripts\") pod \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.008706 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-config-data\") pod \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.008725 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-sg-core-conf-yaml\") pod \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.008763 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-log-httpd\") pod \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.008778 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-run-httpd\") pod \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\" (UID: \"50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0\") " Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.010230 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-run-httpd" (OuterVolumeSpecName: 
"run-httpd") pod "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" (UID: "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.010268 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" (UID: "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.018088 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-scripts" (OuterVolumeSpecName: "scripts") pod "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" (UID: "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.040204 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-kube-api-access-dvmrs" (OuterVolumeSpecName: "kube-api-access-dvmrs") pod "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" (UID: "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0"). InnerVolumeSpecName "kube-api-access-dvmrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.052763 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" (UID: "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.115347 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.115381 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.115394 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.115402 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.115412 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvmrs\" (UniqueName: \"kubernetes.io/projected/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-kube-api-access-dvmrs\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.140417 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" (UID: "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.195626 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-config-data" (OuterVolumeSpecName: "config-data") pod "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" (UID: "50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.219549 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.219577 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.860623 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.920410 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.934609 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.944112 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:16:21 crc kubenswrapper[4585]: E1201 14:16:21.944581 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="proxy-httpd" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.944605 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="proxy-httpd" Dec 01 14:16:21 crc kubenswrapper[4585]: E1201 14:16:21.944629 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="ceilometer-notification-agent" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.944640 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="ceilometer-notification-agent" Dec 01 14:16:21 crc kubenswrapper[4585]: E1201 14:16:21.944672 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="ceilometer-central-agent" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.944681 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="ceilometer-central-agent" Dec 01 14:16:21 crc kubenswrapper[4585]: E1201 14:16:21.944720 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="sg-core" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.944730 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="sg-core" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.944959 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="proxy-httpd" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.945012 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="sg-core" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.945031 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="ceilometer-central-agent" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.945042 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" containerName="ceilometer-notification-agent" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.947312 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.951357 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.953428 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 14:16:21 crc kubenswrapper[4585]: I1201 14:16:21.953675 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.043198 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.043264 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-config-data\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.043298 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtnws\" (UniqueName: \"kubernetes.io/projected/8915c969-9480-47da-90a9-311f504dbe66-kube-api-access-wtnws\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.043434 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8915c969-9480-47da-90a9-311f504dbe66-run-httpd\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.043548 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-scripts\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.043700 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8915c969-9480-47da-90a9-311f504dbe66-log-httpd\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.043816 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.097473 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.146824 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnws\" (UniqueName: 
\"kubernetes.io/projected/8915c969-9480-47da-90a9-311f504dbe66-kube-api-access-wtnws\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.146871 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8915c969-9480-47da-90a9-311f504dbe66-run-httpd\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.146917 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-scripts\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.147145 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8915c969-9480-47da-90a9-311f504dbe66-log-httpd\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.147206 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.147256 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.147321 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-config-data\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.147579 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8915c969-9480-47da-90a9-311f504dbe66-run-httpd\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.147799 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8915c969-9480-47da-90a9-311f504dbe66-log-httpd\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.152585 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.152918 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-scripts\") 
pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.154656 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.163039 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-config-data\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.166770 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtnws\" (UniqueName: \"kubernetes.io/projected/8915c969-9480-47da-90a9-311f504dbe66-kube-api-access-wtnws\") pod \"ceilometer-0\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.270533 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.443691 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0" path="/var/lib/kubelet/pods/50beb2ee-bd9d-4e86-9c8e-eaa093bd7da0/volumes" Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.862236 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:16:22 crc kubenswrapper[4585]: I1201 14:16:22.888353 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8915c969-9480-47da-90a9-311f504dbe66","Type":"ContainerStarted","Data":"312e5fc609f92a4eee4b75bc8b89ce949bea40137be21626fc69f9dbd5bb4724"} Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.720896 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.887234 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5dc687f89-lzwxh"] Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.889110 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.896380 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.896495 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.898584 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.907645 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5dc687f89-lzwxh"] Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.985396 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-internal-tls-certs\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.985801 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-etc-swift\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.985830 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-combined-ca-bundle\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.985887 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-config-data\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.986291 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-log-httpd\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.986477 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-run-httpd\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.986560 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-public-tls-certs\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 
14:16:23 crc kubenswrapper[4585]: I1201 14:16:23.986598 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlcb4\" (UniqueName: \"kubernetes.io/projected/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-kube-api-access-vlcb4\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.088146 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-log-httpd\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.088210 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-run-httpd\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.088240 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-public-tls-certs\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.088258 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlcb4\" (UniqueName: \"kubernetes.io/projected/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-kube-api-access-vlcb4\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.088292 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-internal-tls-certs\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.088329 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-etc-swift\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.088345 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-combined-ca-bundle\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.088368 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-config-data\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 
14:16:24.089221 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-log-httpd\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.089291 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-run-httpd\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.095222 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-config-data\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.106027 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-public-tls-certs\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.107000 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-internal-tls-certs\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.108564 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-combined-ca-bundle\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.110547 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-etc-swift\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.112571 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlcb4\" (UniqueName: \"kubernetes.io/projected/8f12e73f-f03a-4a68-a3e5-d4373d8fc583-kube-api-access-vlcb4\") pod \"swift-proxy-5dc687f89-lzwxh\" (UID: \"8f12e73f-f03a-4a68-a3e5-d4373d8fc583\") " pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:24 crc kubenswrapper[4585]: I1201 14:16:24.214781 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:27 crc kubenswrapper[4585]: I1201 14:16:27.396685 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 14:16:30 crc kubenswrapper[4585]: E1201 14:16:30.621842 4585 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/811ed845b4a6dffd38f18b8fb7878b32b66a184cdf9bf82b5b88531607da982d/diff" to get inode usage: stat /var/lib/containers/storage/overlay/811ed845b4a6dffd38f18b8fb7878b32b66a184cdf9bf82b5b88531607da982d/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_ceilometer-0_192bd785-3d80-4b5e-8db2-55a0e3846802/ceilometer-notification-agent/0.log" to get inode usage: stat /var/log/pods/openstack_ceilometer-0_192bd785-3d80-4b5e-8db2-55a0e3846802/ceilometer-notification-agent/0.log: no such file or directory Dec 01 14:16:30 crc kubenswrapper[4585]: W1201 14:16:30.899329 4585 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-conmon-b0eb593f0744ac672633eec711ebaac9fa0a287fef4a949cc99dcd5d1b19b7d3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-conmon-b0eb593f0744ac672633eec711ebaac9fa0a287fef4a949cc99dcd5d1b19b7d3.scope: no such file or directory Dec 01 14:16:30 crc kubenswrapper[4585]: W1201 14:16:30.902391 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b331842_f530_45ae_92e4_59fd379833b0.slice/crio-d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e.scope WatchSource:0}: Error finding container d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e: Status 404 returned error can't find the container with id d6854a6b5518f5a46d76ea04013f9dd8f5c9f68c5ce1c10210b74d60ea48925e Dec 01 14:16:30 crc kubenswrapper[4585]: W1201 14:16:30.902676 4585 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-b0eb593f0744ac672633eec711ebaac9fa0a287fef4a949cc99dcd5d1b19b7d3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-b0eb593f0744ac672633eec711ebaac9fa0a287fef4a949cc99dcd5d1b19b7d3.scope: no such file or directory Dec 01 14:16:30 crc kubenswrapper[4585]: W1201 14:16:30.902700 4585 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-conmon-35efd55eac44182903f892f02ab338367b78ab9ae756aaa0f340e6e43426aacc.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-conmon-35efd55eac44182903f892f02ab338367b78ab9ae756aaa0f340e6e43426aacc.scope: no such file or directory Dec 01 14:16:30 crc kubenswrapper[4585]: W1201 14:16:30.902725 4585 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-35efd55eac44182903f892f02ab338367b78ab9ae756aaa0f340e6e43426aacc.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-35efd55eac44182903f892f02ab338367b78ab9ae756aaa0f340e6e43426aacc.scope: no such file or directory Dec 01 14:16:30 crc kubenswrapper[4585]: W1201 14:16:30.902737 4585 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-conmon-44ac16e77a217a165b7ab75f4de65db78cc2a9b0babeb3e2eb14c57da0487bba.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-conmon-44ac16e77a217a165b7ab75f4de65db78cc2a9b0babeb3e2eb14c57da0487bba.scope: no such file or directory Dec 01 14:16:30 crc kubenswrapper[4585]: W1201 14:16:30.902750 4585 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-44ac16e77a217a165b7ab75f4de65db78cc2a9b0babeb3e2eb14c57da0487bba.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-44ac16e77a217a165b7ab75f4de65db78cc2a9b0babeb3e2eb14c57da0487bba.scope: no such file or directory Dec 01 14:16:30 crc kubenswrapper[4585]: W1201 14:16:30.903938 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b331842_f530_45ae_92e4_59fd379833b0.slice/crio-19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636.scope WatchSource:0}: Error finding container 19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636: Status 404 returned error can't find the container with id 19fb59a953575ee686aa9bbd83e688a83c3e716c8240424b250f91c4e1064636 Dec 01 14:16:30 crc kubenswrapper[4585]: W1201 14:16:30.904366 4585 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-conmon-3a9ba0070c1ab7d5e1a7069b52779c3f7403565c411ecec70078fa83ead02784.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-conmon-3a9ba0070c1ab7d5e1a7069b52779c3f7403565c411ecec70078fa83ead02784.scope: no such file or directory Dec 01 14:16:30 crc kubenswrapper[4585]: W1201 14:16:30.904389 4585 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-3a9ba0070c1ab7d5e1a7069b52779c3f7403565c411ecec70078fa83ead02784.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50beb2ee_bd9d_4e86_9c8e_eaa093bd7da0.slice/crio-3a9ba0070c1ab7d5e1a7069b52779c3f7403565c411ecec70078fa83ead02784.scope: no such file or directory Dec 01 14:16:31 crc kubenswrapper[4585]: I1201 14:16:31.060045 4585 generic.go:334] "Generic (PLEG): container finished" 
podID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerID="93a074bee351885d31ad8a67b537a1943c95441154a2cae193cda36b50b6191f" exitCode=137 Dec 01 14:16:31 crc kubenswrapper[4585]: I1201 14:16:31.060126 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5b64975d-2mfhq" event={"ID":"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288","Type":"ContainerDied","Data":"93a074bee351885d31ad8a67b537a1943c95441154a2cae193cda36b50b6191f"} Dec 01 14:16:31 crc kubenswrapper[4585]: I1201 14:16:31.062212 4585 generic.go:334] "Generic (PLEG): container finished" podID="e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1" containerID="37ec8f025e77f78f9cd14818cc5d1fa36d23c0908d41f30996a803ffc9bcaa9d" exitCode=137 Dec 01 14:16:31 crc kubenswrapper[4585]: I1201 14:16:31.062241 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bbf659b46-55tth" event={"ID":"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1","Type":"ContainerDied","Data":"37ec8f025e77f78f9cd14818cc5d1fa36d23c0908d41f30996a803ffc9bcaa9d"} Dec 01 14:16:31 crc kubenswrapper[4585]: E1201 14:16:31.297917 4585 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3b59a9b_bef9_4f7c_b861_bf6bc2a8bac1.slice/crio-37ec8f025e77f78f9cd14818cc5d1fa36d23c0908d41f30996a803ffc9bcaa9d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6fd44d5_e430_42bb_ad0b_7c78e7a1f288.slice/crio-conmon-93a074bee351885d31ad8a67b537a1943c95441154a2cae193cda36b50b6191f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6fd44d5_e430_42bb_ad0b_7c78e7a1f288.slice/crio-93a074bee351885d31ad8a67b537a1943c95441154a2cae193cda36b50b6191f.scope\": RecentStats: unable to find data in memory cache]" Dec 01 14:16:31 crc kubenswrapper[4585]: I1201 14:16:31.520594 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5dc687f89-lzwxh"] Dec 01 14:16:31 crc kubenswrapper[4585]: W1201 14:16:31.524523 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f12e73f_f03a_4a68_a3e5_d4373d8fc583.slice/crio-73ee70d68a0b67e0133b208e675cbbd453e6f808e8ac57f53b9bb0d7684e3256 WatchSource:0}: Error finding container 73ee70d68a0b67e0133b208e675cbbd453e6f808e8ac57f53b9bb0d7684e3256: Status 404 returned error can't find the container with id 73ee70d68a0b67e0133b208e675cbbd453e6f808e8ac57f53b9bb0d7684e3256 Dec 01 14:16:32 crc kubenswrapper[4585]: I1201 14:16:32.071042 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"29f23dd7-40f6-4677-bd4c-ebdf3152b72f","Type":"ContainerStarted","Data":"c6a0dfa2e422a43283a38393515e0be3151d719b37548e19ba9aec32cbb057ff"} Dec 01 14:16:32 crc kubenswrapper[4585]: I1201 14:16:32.075665 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5dc687f89-lzwxh" event={"ID":"8f12e73f-f03a-4a68-a3e5-d4373d8fc583","Type":"ContainerStarted","Data":"54b3e59710cfd21ec3265c4da6802cad0aba3ddffcb56cc370c2ab79025a72f4"} Dec 01 14:16:32 crc kubenswrapper[4585]: I1201 14:16:32.075705 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5dc687f89-lzwxh" event={"ID":"8f12e73f-f03a-4a68-a3e5-d4373d8fc583","Type":"ContainerStarted","Data":"63cf2ea59aaea5ae69f73a6f601bef56e9edfbafc769974276f0e48331cdd9a2"} Dec 
01 14:16:32 crc kubenswrapper[4585]: I1201 14:16:32.075717 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5dc687f89-lzwxh" event={"ID":"8f12e73f-f03a-4a68-a3e5-d4373d8fc583","Type":"ContainerStarted","Data":"73ee70d68a0b67e0133b208e675cbbd453e6f808e8ac57f53b9bb0d7684e3256"} Dec 01 14:16:32 crc kubenswrapper[4585]: I1201 14:16:32.075755 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:32 crc kubenswrapper[4585]: I1201 14:16:32.075809 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:32 crc kubenswrapper[4585]: I1201 14:16:32.078439 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bbf659b46-55tth" event={"ID":"e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1","Type":"ContainerStarted","Data":"4d8796ea217ae4c4701247ee5a26fec0bb5a2ffd857d94857c57f52e8c1eb462"} Dec 01 14:16:32 crc kubenswrapper[4585]: I1201 14:16:32.080563 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8915c969-9480-47da-90a9-311f504dbe66","Type":"ContainerStarted","Data":"0af8bea6bbe61b702f056d278f1efb6cafa8c8cb41dcfd8c3043d6af7e26cc13"} Dec 01 14:16:32 crc kubenswrapper[4585]: I1201 14:16:32.082189 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5b64975d-2mfhq" event={"ID":"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288","Type":"ContainerStarted","Data":"1f18573e739a744b3eabfcf4b261bed85eaea03874ad4e993ab153ba3999ffcf"} Dec 01 14:16:32 crc kubenswrapper[4585]: I1201 14:16:32.117577 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.733403622 podStartE2EDuration="18.117560403s" podCreationTimestamp="2025-12-01 14:16:14 +0000 UTC" firstStartedPulling="2025-12-01 14:16:15.477595026 +0000 UTC m=+1089.461808881" lastFinishedPulling="2025-12-01 14:16:30.861751797 +0000 UTC m=+1104.845965662" observedRunningTime="2025-12-01 14:16:32.093150342 +0000 UTC m=+1106.077364197" watchObservedRunningTime="2025-12-01 14:16:32.117560403 +0000 UTC m=+1106.101774258" Dec 01 14:16:32 crc kubenswrapper[4585]: I1201 14:16:32.159469 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5dc687f89-lzwxh" podStartSLOduration=9.15944822 podStartE2EDuration="9.15944822s" podCreationTimestamp="2025-12-01 14:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:16:32.126748438 +0000 UTC m=+1106.110962293" watchObservedRunningTime="2025-12-01 14:16:32.15944822 +0000 UTC m=+1106.143662075" Dec 01 14:16:33 crc kubenswrapper[4585]: I1201 14:16:33.093873 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8915c969-9480-47da-90a9-311f504dbe66","Type":"ContainerStarted","Data":"0f29d76ec0ec7144d6b4986a3a21b9d271e69c019dc63122d74087048b21176d"} Dec 01 14:16:34 crc kubenswrapper[4585]: I1201 14:16:34.102627 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8915c969-9480-47da-90a9-311f504dbe66","Type":"ContainerStarted","Data":"2e26278ab2529c1e6c5a0052605bfdd6d26ffd2a44a92454941601340c2f2076"} Dec 01 14:16:35 crc kubenswrapper[4585]: I1201 14:16:35.113150 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8915c969-9480-47da-90a9-311f504dbe66","Type":"ContainerStarted","Data":"0332d9fb0f3900b81d4e1cd08ad52c05c87ef2cec942a44c2959c9f7c374dd84"} Dec 01 14:16:35 crc kubenswrapper[4585]: I1201 14:16:35.113880 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="ceilometer-central-agent" containerID="cri-o://0af8bea6bbe61b702f056d278f1efb6cafa8c8cb41dcfd8c3043d6af7e26cc13" gracePeriod=30 Dec 01 14:16:35 crc kubenswrapper[4585]: I1201 14:16:35.114148 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 14:16:35 crc kubenswrapper[4585]: I1201 14:16:35.114398 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="proxy-httpd" containerID="cri-o://0332d9fb0f3900b81d4e1cd08ad52c05c87ef2cec942a44c2959c9f7c374dd84" gracePeriod=30 Dec 01 14:16:35 crc kubenswrapper[4585]: I1201 14:16:35.114439 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="sg-core" containerID="cri-o://2e26278ab2529c1e6c5a0052605bfdd6d26ffd2a44a92454941601340c2f2076" gracePeriod=30 Dec 01 14:16:35 crc kubenswrapper[4585]: I1201 14:16:35.114474 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="ceilometer-notification-agent" containerID="cri-o://0f29d76ec0ec7144d6b4986a3a21b9d271e69c019dc63122d74087048b21176d" gracePeriod=30 Dec 01 14:16:35 crc kubenswrapper[4585]: I1201 14:16:35.454244 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.503336379 podStartE2EDuration="14.454208774s" podCreationTimestamp="2025-12-01 14:16:21 +0000 UTC" firstStartedPulling="2025-12-01 14:16:22.856211761 +0000 UTC m=+1096.840425626" lastFinishedPulling="2025-12-01 14:16:34.807084166 +0000 UTC m=+1108.791298021" observedRunningTime="2025-12-01 14:16:35.157380508 +0000 UTC m=+1109.141594363" watchObservedRunningTime="2025-12-01 14:16:35.454208774 +0000 UTC m=+1109.438422629" Dec 01 14:16:35 crc kubenswrapper[4585]: I1201 14:16:35.459697 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:16:35 crc kubenswrapper[4585]: I1201 14:16:35.459989 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="25feba7f-466b-4d39-9096-b5101c68502b" containerName="glance-log" containerID="cri-o://54858fd5198f9e7bae6e73d060e6cf805b39e71f39dcdcd87fbed8a918271822" gracePeriod=30 Dec 01 14:16:35 crc kubenswrapper[4585]: I1201 14:16:35.460100 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="25feba7f-466b-4d39-9096-b5101c68502b" containerName="glance-httpd" containerID="cri-o://012370f09a16ff589a0d74c0ddb711e53d3edc2bbffa015966c5aaf1c9be3c39" gracePeriod=30 Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.122958 4585 generic.go:334] "Generic (PLEG): container finished" podID="25feba7f-466b-4d39-9096-b5101c68502b" containerID="54858fd5198f9e7bae6e73d060e6cf805b39e71f39dcdcd87fbed8a918271822" exitCode=143 Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.123032 4585 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25feba7f-466b-4d39-9096-b5101c68502b","Type":"ContainerDied","Data":"54858fd5198f9e7bae6e73d060e6cf805b39e71f39dcdcd87fbed8a918271822"} Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.126221 4585 generic.go:334] "Generic (PLEG): container finished" podID="8915c969-9480-47da-90a9-311f504dbe66" containerID="2e26278ab2529c1e6c5a0052605bfdd6d26ffd2a44a92454941601340c2f2076" exitCode=2 Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.126243 4585 generic.go:334] "Generic (PLEG): container finished" podID="8915c969-9480-47da-90a9-311f504dbe66" containerID="0af8bea6bbe61b702f056d278f1efb6cafa8c8cb41dcfd8c3043d6af7e26cc13" exitCode=0 Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.126260 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8915c969-9480-47da-90a9-311f504dbe66","Type":"ContainerDied","Data":"2e26278ab2529c1e6c5a0052605bfdd6d26ffd2a44a92454941601340c2f2076"} Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.126284 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8915c969-9480-47da-90a9-311f504dbe66","Type":"ContainerDied","Data":"0af8bea6bbe61b702f056d278f1efb6cafa8c8cb41dcfd8c3043d6af7e26cc13"} Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.470701 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xw7sm"] Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.472149 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xw7sm" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.490791 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xw7sm"] Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.582432 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3107-account-create-update-h6npx"] Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.583898 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3107-account-create-update-h6npx" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.587618 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.611081 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3107-account-create-update-h6npx"] Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.652103 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5ea5ee-2cac-4fab-893d-83569ffbae4c-operator-scripts\") pod \"nova-api-db-create-xw7sm\" (UID: \"cd5ea5ee-2cac-4fab-893d-83569ffbae4c\") " pod="openstack/nova-api-db-create-xw7sm" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.652602 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrdpj\" (UniqueName: \"kubernetes.io/projected/cd5ea5ee-2cac-4fab-893d-83569ffbae4c-kube-api-access-qrdpj\") pod \"nova-api-db-create-xw7sm\" (UID: \"cd5ea5ee-2cac-4fab-893d-83569ffbae4c\") " pod="openstack/nova-api-db-create-xw7sm" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.678242 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-kwrm4"] Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.679377 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kwrm4" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.690097 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kwrm4"] Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.753903 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df1492cf-2c9b-4573-9e32-dd372af19bfe-operator-scripts\") pod \"nova-api-3107-account-create-update-h6npx\" (UID: \"df1492cf-2c9b-4573-9e32-dd372af19bfe\") " pod="openstack/nova-api-3107-account-create-update-h6npx" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.754014 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp5bf\" (UniqueName: \"kubernetes.io/projected/df1492cf-2c9b-4573-9e32-dd372af19bfe-kube-api-access-wp5bf\") pod \"nova-api-3107-account-create-update-h6npx\" (UID: \"df1492cf-2c9b-4573-9e32-dd372af19bfe\") " pod="openstack/nova-api-3107-account-create-update-h6npx" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.754064 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5ea5ee-2cac-4fab-893d-83569ffbae4c-operator-scripts\") pod \"nova-api-db-create-xw7sm\" (UID: \"cd5ea5ee-2cac-4fab-893d-83569ffbae4c\") " pod="openstack/nova-api-db-create-xw7sm" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.754375 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrdpj\" (UniqueName: \"kubernetes.io/projected/cd5ea5ee-2cac-4fab-893d-83569ffbae4c-kube-api-access-qrdpj\") pod \"nova-api-db-create-xw7sm\" (UID: \"cd5ea5ee-2cac-4fab-893d-83569ffbae4c\") " pod="openstack/nova-api-db-create-xw7sm" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.754962 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5ea5ee-2cac-4fab-893d-83569ffbae4c-operator-scripts\") pod \"nova-api-db-create-xw7sm\" (UID: \"cd5ea5ee-2cac-4fab-893d-83569ffbae4c\") " pod="openstack/nova-api-db-create-xw7sm" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.771014 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nj957"] Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.772294 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nj957" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.795486 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nj957"] Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.814710 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-96f3-account-create-update-gbhsj"] Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.821555 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.823579 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.829829 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-96f3-account-create-update-gbhsj"] Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.855650 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df1492cf-2c9b-4573-9e32-dd372af19bfe-operator-scripts\") pod \"nova-api-3107-account-create-update-h6npx\" (UID: \"df1492cf-2c9b-4573-9e32-dd372af19bfe\") " pod="openstack/nova-api-3107-account-create-update-h6npx" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.855710 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpjf2\" (UniqueName: \"kubernetes.io/projected/5e77338c-2d1b-4b3f-812a-d8419ed44fb8-kube-api-access-xpjf2\") pod \"nova-cell0-db-create-kwrm4\" (UID: \"5e77338c-2d1b-4b3f-812a-d8419ed44fb8\") " pod="openstack/nova-cell0-db-create-kwrm4" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.856689 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df1492cf-2c9b-4573-9e32-dd372af19bfe-operator-scripts\") pod \"nova-api-3107-account-create-update-h6npx\" (UID: \"df1492cf-2c9b-4573-9e32-dd372af19bfe\") " pod="openstack/nova-api-3107-account-create-update-h6npx" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.856840 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp5bf\" (UniqueName: \"kubernetes.io/projected/df1492cf-2c9b-4573-9e32-dd372af19bfe-kube-api-access-wp5bf\") pod \"nova-api-3107-account-create-update-h6npx\" (UID: \"df1492cf-2c9b-4573-9e32-dd372af19bfe\") " pod="openstack/nova-api-3107-account-create-update-h6npx" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.857145 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e77338c-2d1b-4b3f-812a-d8419ed44fb8-operator-scripts\") pod \"nova-cell0-db-create-kwrm4\" (UID: \"5e77338c-2d1b-4b3f-812a-d8419ed44fb8\") " pod="openstack/nova-cell0-db-create-kwrm4" Dec 01 
14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.865886 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrdpj\" (UniqueName: \"kubernetes.io/projected/cd5ea5ee-2cac-4fab-893d-83569ffbae4c-kube-api-access-qrdpj\") pod \"nova-api-db-create-xw7sm\" (UID: \"cd5ea5ee-2cac-4fab-893d-83569ffbae4c\") " pod="openstack/nova-api-db-create-xw7sm" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.878124 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp5bf\" (UniqueName: \"kubernetes.io/projected/df1492cf-2c9b-4573-9e32-dd372af19bfe-kube-api-access-wp5bf\") pod \"nova-api-3107-account-create-update-h6npx\" (UID: \"df1492cf-2c9b-4573-9e32-dd372af19bfe\") " pod="openstack/nova-api-3107-account-create-update-h6npx" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.900259 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3107-account-create-update-h6npx" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.958486 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517-operator-scripts\") pod \"nova-cell1-db-create-nj957\" (UID: \"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517\") " pod="openstack/nova-cell1-db-create-nj957" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.958873 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpjf2\" (UniqueName: \"kubernetes.io/projected/5e77338c-2d1b-4b3f-812a-d8419ed44fb8-kube-api-access-xpjf2\") pod \"nova-cell0-db-create-kwrm4\" (UID: \"5e77338c-2d1b-4b3f-812a-d8419ed44fb8\") " pod="openstack/nova-cell0-db-create-kwrm4" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.958903 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jb6t\" (UniqueName: \"kubernetes.io/projected/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517-kube-api-access-6jb6t\") pod \"nova-cell1-db-create-nj957\" (UID: \"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517\") " pod="openstack/nova-cell1-db-create-nj957" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.958963 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvkrb\" (UniqueName: \"kubernetes.io/projected/8fae72ee-6647-4105-ab94-ed2ab6bed7da-kube-api-access-cvkrb\") pod \"nova-cell0-96f3-account-create-update-gbhsj\" (UID: \"8fae72ee-6647-4105-ab94-ed2ab6bed7da\") " pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.959000 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fae72ee-6647-4105-ab94-ed2ab6bed7da-operator-scripts\") pod \"nova-cell0-96f3-account-create-update-gbhsj\" (UID: \"8fae72ee-6647-4105-ab94-ed2ab6bed7da\") " pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.959035 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e77338c-2d1b-4b3f-812a-d8419ed44fb8-operator-scripts\") pod \"nova-cell0-db-create-kwrm4\" (UID: \"5e77338c-2d1b-4b3f-812a-d8419ed44fb8\") " pod="openstack/nova-cell0-db-create-kwrm4" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 
14:16:36.959781 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e77338c-2d1b-4b3f-812a-d8419ed44fb8-operator-scripts\") pod \"nova-cell0-db-create-kwrm4\" (UID: \"5e77338c-2d1b-4b3f-812a-d8419ed44fb8\") " pod="openstack/nova-cell0-db-create-kwrm4" Dec 01 14:16:36 crc kubenswrapper[4585]: I1201 14:16:36.985696 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpjf2\" (UniqueName: \"kubernetes.io/projected/5e77338c-2d1b-4b3f-812a-d8419ed44fb8-kube-api-access-xpjf2\") pod \"nova-cell0-db-create-kwrm4\" (UID: \"5e77338c-2d1b-4b3f-812a-d8419ed44fb8\") " pod="openstack/nova-cell0-db-create-kwrm4" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.006173 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kwrm4" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.008480 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-be71-account-create-update-jns4b"] Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.009893 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-be71-account-create-update-jns4b" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.015379 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.016428 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-be71-account-create-update-jns4b"] Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.060622 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvkrb\" (UniqueName: \"kubernetes.io/projected/8fae72ee-6647-4105-ab94-ed2ab6bed7da-kube-api-access-cvkrb\") pod \"nova-cell0-96f3-account-create-update-gbhsj\" (UID: \"8fae72ee-6647-4105-ab94-ed2ab6bed7da\") " pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.060664 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fae72ee-6647-4105-ab94-ed2ab6bed7da-operator-scripts\") pod \"nova-cell0-96f3-account-create-update-gbhsj\" (UID: \"8fae72ee-6647-4105-ab94-ed2ab6bed7da\") " pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.060726 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517-operator-scripts\") pod \"nova-cell1-db-create-nj957\" (UID: \"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517\") " pod="openstack/nova-cell1-db-create-nj957" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.060785 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jb6t\" (UniqueName: \"kubernetes.io/projected/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517-kube-api-access-6jb6t\") pod \"nova-cell1-db-create-nj957\" (UID: \"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517\") " pod="openstack/nova-cell1-db-create-nj957" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.061579 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fae72ee-6647-4105-ab94-ed2ab6bed7da-operator-scripts\") pod 
\"nova-cell0-96f3-account-create-update-gbhsj\" (UID: \"8fae72ee-6647-4105-ab94-ed2ab6bed7da\") " pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.062044 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517-operator-scripts\") pod \"nova-cell1-db-create-nj957\" (UID: \"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517\") " pod="openstack/nova-cell1-db-create-nj957" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.087430 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jb6t\" (UniqueName: \"kubernetes.io/projected/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517-kube-api-access-6jb6t\") pod \"nova-cell1-db-create-nj957\" (UID: \"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517\") " pod="openstack/nova-cell1-db-create-nj957" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.087478 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xw7sm" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.119720 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvkrb\" (UniqueName: \"kubernetes.io/projected/8fae72ee-6647-4105-ab94-ed2ab6bed7da-kube-api-access-cvkrb\") pod \"nova-cell0-96f3-account-create-update-gbhsj\" (UID: \"8fae72ee-6647-4105-ab94-ed2ab6bed7da\") " pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.144255 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.176117 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rvl4\" (UniqueName: \"kubernetes.io/projected/c9288a94-0029-4c7b-825a-ad1d005c736d-kube-api-access-5rvl4\") pod \"nova-cell1-be71-account-create-update-jns4b\" (UID: \"c9288a94-0029-4c7b-825a-ad1d005c736d\") " pod="openstack/nova-cell1-be71-account-create-update-jns4b" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.176238 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9288a94-0029-4c7b-825a-ad1d005c736d-operator-scripts\") pod \"nova-cell1-be71-account-create-update-jns4b\" (UID: \"c9288a94-0029-4c7b-825a-ad1d005c736d\") " pod="openstack/nova-cell1-be71-account-create-update-jns4b" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.287221 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9288a94-0029-4c7b-825a-ad1d005c736d-operator-scripts\") pod \"nova-cell1-be71-account-create-update-jns4b\" (UID: \"c9288a94-0029-4c7b-825a-ad1d005c736d\") " pod="openstack/nova-cell1-be71-account-create-update-jns4b" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.287669 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rvl4\" (UniqueName: \"kubernetes.io/projected/c9288a94-0029-4c7b-825a-ad1d005c736d-kube-api-access-5rvl4\") pod \"nova-cell1-be71-account-create-update-jns4b\" (UID: \"c9288a94-0029-4c7b-825a-ad1d005c736d\") " pod="openstack/nova-cell1-be71-account-create-update-jns4b" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.288524 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9288a94-0029-4c7b-825a-ad1d005c736d-operator-scripts\") pod \"nova-cell1-be71-account-create-update-jns4b\" (UID: \"c9288a94-0029-4c7b-825a-ad1d005c736d\") " pod="openstack/nova-cell1-be71-account-create-update-jns4b" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.328648 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.328854 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6a302cf5-b263-4654-b7cc-e7122f4b11cb" containerName="glance-log" containerID="cri-o://7889205e17509d9adb2f2146398b2a92c586e75952713cd7b30905412665187d" gracePeriod=30 Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.330034 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6a302cf5-b263-4654-b7cc-e7122f4b11cb" containerName="glance-httpd" containerID="cri-o://a0f5bda5a277365c87c89fc629c83a60030e06321be8230e8eafb8508497289b" gracePeriod=30 Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.334102 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rvl4\" (UniqueName: \"kubernetes.io/projected/c9288a94-0029-4c7b-825a-ad1d005c736d-kube-api-access-5rvl4\") pod \"nova-cell1-be71-account-create-update-jns4b\" (UID: \"c9288a94-0029-4c7b-825a-ad1d005c736d\") " pod="openstack/nova-cell1-be71-account-create-update-jns4b" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.359607 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-be71-account-create-update-jns4b" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.391726 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nj957" Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.794797 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xw7sm"] Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.822833 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3107-account-create-update-h6npx"] Dec 01 14:16:37 crc kubenswrapper[4585]: I1201 14:16:37.981963 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kwrm4"] Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.208256 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3107-account-create-update-h6npx" event={"ID":"df1492cf-2c9b-4573-9e32-dd372af19bfe","Type":"ContainerStarted","Data":"210faecfd5ab89f78c055cf460df633e587ccce7a03804af0f6291cc89d16943"} Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.224117 4585 generic.go:334] "Generic (PLEG): container finished" podID="a0f792a5-42c4-4a55-af66-55abc1be9684" containerID="a4558e1522ce76ce77e9244033bd909cde8395398e9db6d8be4e40428e57641d" exitCode=137 Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.224182 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0f792a5-42c4-4a55-af66-55abc1be9684","Type":"ContainerDied","Data":"a4558e1522ce76ce77e9244033bd909cde8395398e9db6d8be4e40428e57641d"} Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.230360 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kwrm4" event={"ID":"5e77338c-2d1b-4b3f-812a-d8419ed44fb8","Type":"ContainerStarted","Data":"4bc92a92ce9e944811d18497801eb6e7147d985e8c1d698c9c6e449a4c0bb0c4"} Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.236520 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xw7sm" event={"ID":"cd5ea5ee-2cac-4fab-893d-83569ffbae4c","Type":"ContainerStarted","Data":"2edc900ea7546e78b2514bc6c0cd5d79910f70355e38d6964062cbd0a6d8a163"} Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.258151 4585 generic.go:334] "Generic (PLEG): container finished" podID="6a302cf5-b263-4654-b7cc-e7122f4b11cb" containerID="7889205e17509d9adb2f2146398b2a92c586e75952713cd7b30905412665187d" exitCode=143 Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.258197 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a302cf5-b263-4654-b7cc-e7122f4b11cb","Type":"ContainerDied","Data":"7889205e17509d9adb2f2146398b2a92c586e75952713cd7b30905412665187d"} Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.467105 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-96f3-account-create-update-gbhsj"] Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.480148 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-be71-account-create-update-jns4b"] Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.497037 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nj957"] Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.562836 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.743175 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0f792a5-42c4-4a55-af66-55abc1be9684-etc-machine-id\") pod \"a0f792a5-42c4-4a55-af66-55abc1be9684\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.743631 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-config-data\") pod \"a0f792a5-42c4-4a55-af66-55abc1be9684\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.743652 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-config-data-custom\") pod \"a0f792a5-42c4-4a55-af66-55abc1be9684\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.743709 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zljdp\" (UniqueName: \"kubernetes.io/projected/a0f792a5-42c4-4a55-af66-55abc1be9684-kube-api-access-zljdp\") pod \"a0f792a5-42c4-4a55-af66-55abc1be9684\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.743730 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-combined-ca-bundle\") pod \"a0f792a5-42c4-4a55-af66-55abc1be9684\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.743793 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-scripts\") pod \"a0f792a5-42c4-4a55-af66-55abc1be9684\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.743874 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f792a5-42c4-4a55-af66-55abc1be9684-logs\") pod \"a0f792a5-42c4-4a55-af66-55abc1be9684\" (UID: \"a0f792a5-42c4-4a55-af66-55abc1be9684\") " Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.744057 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0f792a5-42c4-4a55-af66-55abc1be9684-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a0f792a5-42c4-4a55-af66-55abc1be9684" (UID: "a0f792a5-42c4-4a55-af66-55abc1be9684"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.744338 4585 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0f792a5-42c4-4a55-af66-55abc1be9684-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.744723 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f792a5-42c4-4a55-af66-55abc1be9684-logs" (OuterVolumeSpecName: "logs") pod "a0f792a5-42c4-4a55-af66-55abc1be9684" (UID: "a0f792a5-42c4-4a55-af66-55abc1be9684"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.757639 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f792a5-42c4-4a55-af66-55abc1be9684-kube-api-access-zljdp" (OuterVolumeSpecName: "kube-api-access-zljdp") pod "a0f792a5-42c4-4a55-af66-55abc1be9684" (UID: "a0f792a5-42c4-4a55-af66-55abc1be9684"). InnerVolumeSpecName "kube-api-access-zljdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.764117 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-scripts" (OuterVolumeSpecName: "scripts") pod "a0f792a5-42c4-4a55-af66-55abc1be9684" (UID: "a0f792a5-42c4-4a55-af66-55abc1be9684"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.764165 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a0f792a5-42c4-4a55-af66-55abc1be9684" (UID: "a0f792a5-42c4-4a55-af66-55abc1be9684"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.825908 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0f792a5-42c4-4a55-af66-55abc1be9684" (UID: "a0f792a5-42c4-4a55-af66-55abc1be9684"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.847317 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0f792a5-42c4-4a55-af66-55abc1be9684-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.847345 4585 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.847359 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zljdp\" (UniqueName: \"kubernetes.io/projected/a0f792a5-42c4-4a55-af66-55abc1be9684-kube-api-access-zljdp\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.847370 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.847383 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.856159 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-config-data" (OuterVolumeSpecName: "config-data") pod "a0f792a5-42c4-4a55-af66-55abc1be9684" (UID: "a0f792a5-42c4-4a55-af66-55abc1be9684"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:38 crc kubenswrapper[4585]: I1201 14:16:38.949033 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0f792a5-42c4-4a55-af66-55abc1be9684-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.237114 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.289006 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5dc687f89-lzwxh" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.298435 4585 generic.go:334] "Generic (PLEG): container finished" podID="cd5ea5ee-2cac-4fab-893d-83569ffbae4c" containerID="56623aba05bacca8f53c48692379602533229bec61a02d4e039c500069208602" exitCode=0 Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.298567 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xw7sm" event={"ID":"cd5ea5ee-2cac-4fab-893d-83569ffbae4c","Type":"ContainerDied","Data":"56623aba05bacca8f53c48692379602533229bec61a02d4e039c500069208602"} Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.304368 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nj957" event={"ID":"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517","Type":"ContainerStarted","Data":"277d5f3f1d56f731f4b7678d72eba1fd9a49388f248c0e874a5b1091c482f2ee"} Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.310052 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-be71-account-create-update-jns4b" event={"ID":"c9288a94-0029-4c7b-825a-ad1d005c736d","Type":"ContainerStarted","Data":"e68645bbc5a4775bda75665356d68ad4171f5673bf6730506664aa42347b28d5"} Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.310142 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-be71-account-create-update-jns4b" event={"ID":"c9288a94-0029-4c7b-825a-ad1d005c736d","Type":"ContainerStarted","Data":"dab6cc6b3ded6b3262dc903b66498f01162fb4421d9572c89ddf80f6f24d47e3"} Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.334017 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3107-account-create-update-h6npx" event={"ID":"df1492cf-2c9b-4573-9e32-dd372af19bfe","Type":"ContainerStarted","Data":"68f80228535ebc2a63137bc109a9b59ff5ca0b48b8f3cb66e64b313712519a9f"} Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.355704 4585 generic.go:334] "Generic (PLEG): container finished" podID="25feba7f-466b-4d39-9096-b5101c68502b" containerID="012370f09a16ff589a0d74c0ddb711e53d3edc2bbffa015966c5aaf1c9be3c39" exitCode=0 Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.355789 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25feba7f-466b-4d39-9096-b5101c68502b","Type":"ContainerDied","Data":"012370f09a16ff589a0d74c0ddb711e53d3edc2bbffa015966c5aaf1c9be3c39"} Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.357581 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kwrm4" event={"ID":"5e77338c-2d1b-4b3f-812a-d8419ed44fb8","Type":"ContainerStarted","Data":"d82a1c29492bac4e19d334478f06a5f4e23b64ac27e343752d065d1407092ed0"} Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.362626 4585 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0f792a5-42c4-4a55-af66-55abc1be9684","Type":"ContainerDied","Data":"48e53bd1661e973c1addbe6cc6cd5d3056142db4a00908fe269811d06edec52a"} Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.362670 4585 scope.go:117] "RemoveContainer" containerID="a4558e1522ce76ce77e9244033bd909cde8395398e9db6d8be4e40428e57641d" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.362778 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.377403 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-be71-account-create-update-jns4b" podStartSLOduration=3.377382226 podStartE2EDuration="3.377382226s" podCreationTimestamp="2025-12-01 14:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:16:39.339899737 +0000 UTC m=+1113.324113592" watchObservedRunningTime="2025-12-01 14:16:39.377382226 +0000 UTC m=+1113.361596081" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.385045 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" event={"ID":"8fae72ee-6647-4105-ab94-ed2ab6bed7da","Type":"ContainerStarted","Data":"f26036630c013e3c3c70cb8cb34d25c08e53d28eecb756c76aaf19b2e2b04444"} Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.428592 4585 scope.go:117] "RemoveContainer" containerID="4d4ac1f3d1190d82d6047f199043485ff97e61fab0a004e4c8f649462887feb6" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.496040 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-3107-account-create-update-h6npx" podStartSLOduration=3.49602092 podStartE2EDuration="3.49602092s" podCreationTimestamp="2025-12-01 14:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:16:39.408380753 +0000 UTC m=+1113.392594608" watchObservedRunningTime="2025-12-01 14:16:39.49602092 +0000 UTC m=+1113.480234775" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.506044 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-kwrm4" podStartSLOduration=3.506027957 podStartE2EDuration="3.506027957s" podCreationTimestamp="2025-12-01 14:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:16:39.428370096 +0000 UTC m=+1113.412583951" watchObservedRunningTime="2025-12-01 14:16:39.506027957 +0000 UTC m=+1113.490241812" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.557668 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.601677 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.611797 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" podStartSLOduration=3.611775107 podStartE2EDuration="3.611775107s" podCreationTimestamp="2025-12-01 14:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 
14:16:39.483419004 +0000 UTC m=+1113.467632849" watchObservedRunningTime="2025-12-01 14:16:39.611775107 +0000 UTC m=+1113.595988962" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.641515 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 14:16:39 crc kubenswrapper[4585]: E1201 14:16:39.641906 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f792a5-42c4-4a55-af66-55abc1be9684" containerName="cinder-api-log" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.641919 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f792a5-42c4-4a55-af66-55abc1be9684" containerName="cinder-api-log" Dec 01 14:16:39 crc kubenswrapper[4585]: E1201 14:16:39.641931 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f792a5-42c4-4a55-af66-55abc1be9684" containerName="cinder-api" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.641938 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f792a5-42c4-4a55-af66-55abc1be9684" containerName="cinder-api" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.642130 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f792a5-42c4-4a55-af66-55abc1be9684" containerName="cinder-api" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.642154 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f792a5-42c4-4a55-af66-55abc1be9684" containerName="cinder-api-log" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.645530 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.648828 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.650517 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.652792 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.669381 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.804512 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/952acec2-d757-4b65-aaf3-61bb69e5d5d7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.804581 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-config-data-custom\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.804613 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.804662 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.804704 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-config-data\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.804741 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-scripts\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.804761 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952acec2-d757-4b65-aaf3-61bb69e5d5d7-logs\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.804784 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nxtl\" (UniqueName: \"kubernetes.io/projected/952acec2-d757-4b65-aaf3-61bb69e5d5d7-kube-api-access-5nxtl\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.804812 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.906667 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.906715 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-config-data\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.906753 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-scripts\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.906769 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952acec2-d757-4b65-aaf3-61bb69e5d5d7-logs\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc 
kubenswrapper[4585]: I1201 14:16:39.906792 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nxtl\" (UniqueName: \"kubernetes.io/projected/952acec2-d757-4b65-aaf3-61bb69e5d5d7-kube-api-access-5nxtl\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.906825 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.906882 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/952acec2-d757-4b65-aaf3-61bb69e5d5d7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.906920 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-config-data-custom\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.906981 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.907099 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/952acec2-d757-4b65-aaf3-61bb69e5d5d7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.907646 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/952acec2-d757-4b65-aaf3-61bb69e5d5d7-logs\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.922414 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.925500 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-config-data-custom\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.929258 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc 
kubenswrapper[4585]: I1201 14:16:39.932522 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nxtl\" (UniqueName: \"kubernetes.io/projected/952acec2-d757-4b65-aaf3-61bb69e5d5d7-kube-api-access-5nxtl\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.935067 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.936533 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-scripts\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:39 crc kubenswrapper[4585]: I1201 14:16:39.946947 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/952acec2-d757-4b65-aaf3-61bb69e5d5d7-config-data\") pod \"cinder-api-0\" (UID: \"952acec2-d757-4b65-aaf3-61bb69e5d5d7\") " pod="openstack/cinder-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.001354 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.049453 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.212322 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-config-data\") pod \"25feba7f-466b-4d39-9096-b5101c68502b\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.212458 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-public-tls-certs\") pod \"25feba7f-466b-4d39-9096-b5101c68502b\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.212505 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"25feba7f-466b-4d39-9096-b5101c68502b\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.212551 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-combined-ca-bundle\") pod \"25feba7f-466b-4d39-9096-b5101c68502b\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.212570 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-scripts\") pod \"25feba7f-466b-4d39-9096-b5101c68502b\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.212590 4585 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mvvzt\" (UniqueName: \"kubernetes.io/projected/25feba7f-466b-4d39-9096-b5101c68502b-kube-api-access-mvvzt\") pod \"25feba7f-466b-4d39-9096-b5101c68502b\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.212608 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25feba7f-466b-4d39-9096-b5101c68502b-logs\") pod \"25feba7f-466b-4d39-9096-b5101c68502b\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.212642 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25feba7f-466b-4d39-9096-b5101c68502b-httpd-run\") pod \"25feba7f-466b-4d39-9096-b5101c68502b\" (UID: \"25feba7f-466b-4d39-9096-b5101c68502b\") " Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.213534 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25feba7f-466b-4d39-9096-b5101c68502b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "25feba7f-466b-4d39-9096-b5101c68502b" (UID: "25feba7f-466b-4d39-9096-b5101c68502b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.214345 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25feba7f-466b-4d39-9096-b5101c68502b-logs" (OuterVolumeSpecName: "logs") pod "25feba7f-466b-4d39-9096-b5101c68502b" (UID: "25feba7f-466b-4d39-9096-b5101c68502b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.222195 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-scripts" (OuterVolumeSpecName: "scripts") pod "25feba7f-466b-4d39-9096-b5101c68502b" (UID: "25feba7f-466b-4d39-9096-b5101c68502b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.222479 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25feba7f-466b-4d39-9096-b5101c68502b-kube-api-access-mvvzt" (OuterVolumeSpecName: "kube-api-access-mvvzt") pod "25feba7f-466b-4d39-9096-b5101c68502b" (UID: "25feba7f-466b-4d39-9096-b5101c68502b"). InnerVolumeSpecName "kube-api-access-mvvzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.227862 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "25feba7f-466b-4d39-9096-b5101c68502b" (UID: "25feba7f-466b-4d39-9096-b5101c68502b"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.258553 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25feba7f-466b-4d39-9096-b5101c68502b" (UID: "25feba7f-466b-4d39-9096-b5101c68502b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.311128 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "25feba7f-466b-4d39-9096-b5101c68502b" (UID: "25feba7f-466b-4d39-9096-b5101c68502b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.314157 4585 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.314204 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.314216 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.314227 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.314236 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvvzt\" (UniqueName: \"kubernetes.io/projected/25feba7f-466b-4d39-9096-b5101c68502b-kube-api-access-mvvzt\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.314248 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25feba7f-466b-4d39-9096-b5101c68502b-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.314255 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/25feba7f-466b-4d39-9096-b5101c68502b-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.325221 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-config-data" (OuterVolumeSpecName: "config-data") pod "25feba7f-466b-4d39-9096-b5101c68502b" (UID: "25feba7f-466b-4d39-9096-b5101c68502b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.348077 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.395677 4585 generic.go:334] "Generic (PLEG): container finished" podID="8fae72ee-6647-4105-ab94-ed2ab6bed7da" containerID="6c085fb6bd248981b08329c5f59cde937909cb1150b8d67b7030daab26098120" exitCode=0 Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.395734 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" event={"ID":"8fae72ee-6647-4105-ab94-ed2ab6bed7da","Type":"ContainerDied","Data":"6c085fb6bd248981b08329c5f59cde937909cb1150b8d67b7030daab26098120"} Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.398131 4585 generic.go:334] "Generic (PLEG): container finished" podID="cf5a50c8-3aeb-4d5c-b313-b2eed4da3517" containerID="6b10d374ff49f0b52403252a14d9078e06a365dc921dac51dd9a7129fea487fd" exitCode=0 Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.398176 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nj957" event={"ID":"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517","Type":"ContainerDied","Data":"6b10d374ff49f0b52403252a14d9078e06a365dc921dac51dd9a7129fea487fd"} Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.402710 4585 generic.go:334] "Generic (PLEG): container finished" podID="c9288a94-0029-4c7b-825a-ad1d005c736d" containerID="e68645bbc5a4775bda75665356d68ad4171f5673bf6730506664aa42347b28d5" exitCode=0 Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.402749 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-be71-account-create-update-jns4b" event={"ID":"c9288a94-0029-4c7b-825a-ad1d005c736d","Type":"ContainerDied","Data":"e68645bbc5a4775bda75665356d68ad4171f5673bf6730506664aa42347b28d5"} Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.403877 4585 generic.go:334] "Generic (PLEG): container finished" podID="df1492cf-2c9b-4573-9e32-dd372af19bfe" containerID="68f80228535ebc2a63137bc109a9b59ff5ca0b48b8f3cb66e64b313712519a9f" exitCode=0 Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.403913 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3107-account-create-update-h6npx" event={"ID":"df1492cf-2c9b-4573-9e32-dd372af19bfe","Type":"ContainerDied","Data":"68f80228535ebc2a63137bc109a9b59ff5ca0b48b8f3cb66e64b313712519a9f"} Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.405575 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"25feba7f-466b-4d39-9096-b5101c68502b","Type":"ContainerDied","Data":"10b39410aa15cd2cd268c8c6a7d764a663a82ce26fef82aa409c44bb59811eaa"} Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.405605 4585 scope.go:117] "RemoveContainer" containerID="012370f09a16ff589a0d74c0ddb711e53d3edc2bbffa015966c5aaf1c9be3c39" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.405709 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.413745 4585 generic.go:334] "Generic (PLEG): container finished" podID="5e77338c-2d1b-4b3f-812a-d8419ed44fb8" containerID="d82a1c29492bac4e19d334478f06a5f4e23b64ac27e343752d065d1407092ed0" exitCode=0 Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.415408 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25feba7f-466b-4d39-9096-b5101c68502b-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.415429 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.439260 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f792a5-42c4-4a55-af66-55abc1be9684" path="/var/lib/kubelet/pods/a0f792a5-42c4-4a55-af66-55abc1be9684/volumes" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.458352 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kwrm4" event={"ID":"5e77338c-2d1b-4b3f-812a-d8419ed44fb8","Type":"ContainerDied","Data":"d82a1c29492bac4e19d334478f06a5f4e23b64ac27e343752d065d1407092ed0"} Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.523550 4585 scope.go:117] "RemoveContainer" containerID="54858fd5198f9e7bae6e73d060e6cf805b39e71f39dcdcd87fbed8a918271822" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.547749 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.555110 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.580449 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:16:40 crc kubenswrapper[4585]: E1201 14:16:40.580842 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25feba7f-466b-4d39-9096-b5101c68502b" containerName="glance-httpd" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.580861 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="25feba7f-466b-4d39-9096-b5101c68502b" containerName="glance-httpd" Dec 01 14:16:40 crc kubenswrapper[4585]: E1201 14:16:40.580902 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25feba7f-466b-4d39-9096-b5101c68502b" containerName="glance-log" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.580909 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="25feba7f-466b-4d39-9096-b5101c68502b" containerName="glance-log" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.581086 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="25feba7f-466b-4d39-9096-b5101c68502b" containerName="glance-log" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.581108 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="25feba7f-466b-4d39-9096-b5101c68502b" containerName="glance-httpd" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.582022 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.585595 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.585834 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.602650 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.633014 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.633559 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.667902 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 14:16:40 crc kubenswrapper[4585]: W1201 14:16:40.692484 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod952acec2_d757_4b65_aaf3_61bb69e5d5d7.slice/crio-a7e1e517b70edd6932823d766345f95c2120e07b7d9ad7d3e6e04a8e38e28d3f WatchSource:0}: Error finding container a7e1e517b70edd6932823d766345f95c2120e07b7d9ad7d3e6e04a8e38e28d3f: Status 404 returned error can't find the container with id a7e1e517b70edd6932823d766345f95c2120e07b7d9ad7d3e6e04a8e38e28d3f Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.726073 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e570020-789a-4807-9cff-651caad31856-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.726113 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e570020-789a-4807-9cff-651caad31856-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.726199 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.728172 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e570020-789a-4807-9cff-651caad31856-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.728253 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e570020-789a-4807-9cff-651caad31856-logs\") pod \"glance-default-external-api-0\" (UID: 
\"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.728316 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e570020-789a-4807-9cff-651caad31856-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.728365 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e570020-789a-4807-9cff-651caad31856-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.728833 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jv98\" (UniqueName: \"kubernetes.io/projected/2e570020-789a-4807-9cff-651caad31856-kube-api-access-4jv98\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.790501 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.791501 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.830570 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jv98\" (UniqueName: \"kubernetes.io/projected/2e570020-789a-4807-9cff-651caad31856-kube-api-access-4jv98\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.830633 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e570020-789a-4807-9cff-651caad31856-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.830652 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e570020-789a-4807-9cff-651caad31856-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.830718 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.830740 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e570020-789a-4807-9cff-651caad31856-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") 
" pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.830759 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e570020-789a-4807-9cff-651caad31856-logs\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.830781 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e570020-789a-4807-9cff-651caad31856-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.830801 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e570020-789a-4807-9cff-651caad31856-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.831490 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.831812 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e570020-789a-4807-9cff-651caad31856-logs\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.831867 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e570020-789a-4807-9cff-651caad31856-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.837135 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e570020-789a-4807-9cff-651caad31856-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.838485 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e570020-789a-4807-9cff-651caad31856-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.839734 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e570020-789a-4807-9cff-651caad31856-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.843604 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e570020-789a-4807-9cff-651caad31856-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.865408 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jv98\" (UniqueName: \"kubernetes.io/projected/2e570020-789a-4807-9cff-651caad31856-kube-api-access-4jv98\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.884605 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"2e570020-789a-4807-9cff-651caad31856\") " pod="openstack/glance-default-external-api-0" Dec 01 14:16:40 crc kubenswrapper[4585]: I1201 14:16:40.900692 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.095848 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xw7sm" Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.240664 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5ea5ee-2cac-4fab-893d-83569ffbae4c-operator-scripts\") pod \"cd5ea5ee-2cac-4fab-893d-83569ffbae4c\" (UID: \"cd5ea5ee-2cac-4fab-893d-83569ffbae4c\") " Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.240755 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrdpj\" (UniqueName: \"kubernetes.io/projected/cd5ea5ee-2cac-4fab-893d-83569ffbae4c-kube-api-access-qrdpj\") pod \"cd5ea5ee-2cac-4fab-893d-83569ffbae4c\" (UID: \"cd5ea5ee-2cac-4fab-893d-83569ffbae4c\") " Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.241415 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd5ea5ee-2cac-4fab-893d-83569ffbae4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd5ea5ee-2cac-4fab-893d-83569ffbae4c" (UID: "cd5ea5ee-2cac-4fab-893d-83569ffbae4c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.243829 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd5ea5ee-2cac-4fab-893d-83569ffbae4c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.276426 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd5ea5ee-2cac-4fab-893d-83569ffbae4c-kube-api-access-qrdpj" (OuterVolumeSpecName: "kube-api-access-qrdpj") pod "cd5ea5ee-2cac-4fab-893d-83569ffbae4c" (UID: "cd5ea5ee-2cac-4fab-893d-83569ffbae4c"). InnerVolumeSpecName "kube-api-access-qrdpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.346958 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrdpj\" (UniqueName: \"kubernetes.io/projected/cd5ea5ee-2cac-4fab-893d-83569ffbae4c-kube-api-access-qrdpj\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.497230 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"952acec2-d757-4b65-aaf3-61bb69e5d5d7","Type":"ContainerStarted","Data":"a7e1e517b70edd6932823d766345f95c2120e07b7d9ad7d3e6e04a8e38e28d3f"} Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.502439 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xw7sm" event={"ID":"cd5ea5ee-2cac-4fab-893d-83569ffbae4c","Type":"ContainerDied","Data":"2edc900ea7546e78b2514bc6c0cd5d79910f70355e38d6964062cbd0a6d8a163"} Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.502489 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2edc900ea7546e78b2514bc6c0cd5d79910f70355e38d6964062cbd0a6d8a163" Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.502585 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xw7sm" Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.542112 4585 generic.go:334] "Generic (PLEG): container finished" podID="6a302cf5-b263-4654-b7cc-e7122f4b11cb" containerID="a0f5bda5a277365c87c89fc629c83a60030e06321be8230e8eafb8508497289b" exitCode=0 Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.542231 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a302cf5-b263-4654-b7cc-e7122f4b11cb","Type":"ContainerDied","Data":"a0f5bda5a277365c87c89fc629c83a60030e06321be8230e8eafb8508497289b"} Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.794105 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 14:16:41 crc kubenswrapper[4585]: E1201 14:16:41.807554 4585 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd5ea5ee_2cac_4fab_893d_83569ffbae4c.slice/crio-2edc900ea7546e78b2514bc6c0cd5d79910f70355e38d6964062cbd0a6d8a163\": RecentStats: unable to find data in memory cache]" Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.836489 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:16:41 crc kubenswrapper[4585]: W1201 14:16:41.875144 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e570020_789a_4807_9cff_651caad31856.slice/crio-0220c1c7e1ba51f131400491785f0f65a9f6a09fe1cace441d1d3eb314a420e9 WatchSource:0}: Error finding container 0220c1c7e1ba51f131400491785f0f65a9f6a09fe1cace441d1d3eb314a420e9: Status 404 returned error can't find the container with id 0220c1c7e1ba51f131400491785f0f65a9f6a09fe1cace441d1d3eb314a420e9 Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.969734 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.969871 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-config-data\") pod \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.969939 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-scripts\") pod \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.969956 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-combined-ca-bundle\") pod \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.970027 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-internal-tls-certs\") pod \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.970052 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a302cf5-b263-4654-b7cc-e7122f4b11cb-httpd-run\") pod \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.970074 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a302cf5-b263-4654-b7cc-e7122f4b11cb-logs\") pod \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.970098 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj55m\" (UniqueName: \"kubernetes.io/projected/6a302cf5-b263-4654-b7cc-e7122f4b11cb-kube-api-access-mj55m\") pod \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\" (UID: \"6a302cf5-b263-4654-b7cc-e7122f4b11cb\") " Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.984062 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6a302cf5-b263-4654-b7cc-e7122f4b11cb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6a302cf5-b263-4654-b7cc-e7122f4b11cb" (UID: "6a302cf5-b263-4654-b7cc-e7122f4b11cb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.984395 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a302cf5-b263-4654-b7cc-e7122f4b11cb-logs" (OuterVolumeSpecName: "logs") pod "6a302cf5-b263-4654-b7cc-e7122f4b11cb" (UID: "6a302cf5-b263-4654-b7cc-e7122f4b11cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:16:41 crc kubenswrapper[4585]: I1201 14:16:41.998126 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-scripts" (OuterVolumeSpecName: "scripts") pod "6a302cf5-b263-4654-b7cc-e7122f4b11cb" (UID: "6a302cf5-b263-4654-b7cc-e7122f4b11cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:41.998850 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "6a302cf5-b263-4654-b7cc-e7122f4b11cb" (UID: "6a302cf5-b263-4654-b7cc-e7122f4b11cb"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.006319 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a302cf5-b263-4654-b7cc-e7122f4b11cb-kube-api-access-mj55m" (OuterVolumeSpecName: "kube-api-access-mj55m") pod "6a302cf5-b263-4654-b7cc-e7122f4b11cb" (UID: "6a302cf5-b263-4654-b7cc-e7122f4b11cb"). InnerVolumeSpecName "kube-api-access-mj55m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.072577 4585 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a302cf5-b263-4654-b7cc-e7122f4b11cb-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.072807 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a302cf5-b263-4654-b7cc-e7122f4b11cb-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.072873 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj55m\" (UniqueName: \"kubernetes.io/projected/6a302cf5-b263-4654-b7cc-e7122f4b11cb-kube-api-access-mj55m\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.072944 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.073018 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.105132 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a302cf5-b263-4654-b7cc-e7122f4b11cb" (UID: "6a302cf5-b263-4654-b7cc-e7122f4b11cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.147336 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.189854 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.200328 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.233198 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-config-data" (OuterVolumeSpecName: "config-data") pod "6a302cf5-b263-4654-b7cc-e7122f4b11cb" (UID: "6a302cf5-b263-4654-b7cc-e7122f4b11cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.287754 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6a302cf5-b263-4654-b7cc-e7122f4b11cb" (UID: "6a302cf5-b263-4654-b7cc-e7122f4b11cb"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.302452 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.302669 4585 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a302cf5-b263-4654-b7cc-e7122f4b11cb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.370826 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3107-account-create-update-h6npx" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.466591 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25feba7f-466b-4d39-9096-b5101c68502b" path="/var/lib/kubelet/pods/25feba7f-466b-4d39-9096-b5101c68502b/volumes" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.507693 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp5bf\" (UniqueName: \"kubernetes.io/projected/df1492cf-2c9b-4573-9e32-dd372af19bfe-kube-api-access-wp5bf\") pod \"df1492cf-2c9b-4573-9e32-dd372af19bfe\" (UID: \"df1492cf-2c9b-4573-9e32-dd372af19bfe\") " Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.507782 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df1492cf-2c9b-4573-9e32-dd372af19bfe-operator-scripts\") pod \"df1492cf-2c9b-4573-9e32-dd372af19bfe\" (UID: \"df1492cf-2c9b-4573-9e32-dd372af19bfe\") " Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.508519 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df1492cf-2c9b-4573-9e32-dd372af19bfe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df1492cf-2c9b-4573-9e32-dd372af19bfe" (UID: "df1492cf-2c9b-4573-9e32-dd372af19bfe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.510618 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df1492cf-2c9b-4573-9e32-dd372af19bfe-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.516136 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df1492cf-2c9b-4573-9e32-dd372af19bfe-kube-api-access-wp5bf" (OuterVolumeSpecName: "kube-api-access-wp5bf") pod "df1492cf-2c9b-4573-9e32-dd372af19bfe" (UID: "df1492cf-2c9b-4573-9e32-dd372af19bfe"). InnerVolumeSpecName "kube-api-access-wp5bf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.614126 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp5bf\" (UniqueName: \"kubernetes.io/projected/df1492cf-2c9b-4573-9e32-dd372af19bfe-kube-api-access-wp5bf\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.626913 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a302cf5-b263-4654-b7cc-e7122f4b11cb","Type":"ContainerDied","Data":"77c3b35fca68462f875efade018683047afcdbd79fbbaf2aa8a2840fd6ad625a"} Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.626961 4585 scope.go:117] "RemoveContainer" containerID="a0f5bda5a277365c87c89fc629c83a60030e06321be8230e8eafb8508497289b" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.627033 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kwrm4" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.627123 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.662856 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3107-account-create-update-h6npx" event={"ID":"df1492cf-2c9b-4573-9e32-dd372af19bfe","Type":"ContainerDied","Data":"210faecfd5ab89f78c055cf460df633e587ccce7a03804af0f6291cc89d16943"} Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.663512 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="210faecfd5ab89f78c055cf460df633e587ccce7a03804af0f6291cc89d16943" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.663671 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3107-account-create-update-h6npx" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.717573 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e77338c-2d1b-4b3f-812a-d8419ed44fb8-operator-scripts\") pod \"5e77338c-2d1b-4b3f-812a-d8419ed44fb8\" (UID: \"5e77338c-2d1b-4b3f-812a-d8419ed44fb8\") " Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.717626 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpjf2\" (UniqueName: \"kubernetes.io/projected/5e77338c-2d1b-4b3f-812a-d8419ed44fb8-kube-api-access-xpjf2\") pod \"5e77338c-2d1b-4b3f-812a-d8419ed44fb8\" (UID: \"5e77338c-2d1b-4b3f-812a-d8419ed44fb8\") " Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.718579 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e77338c-2d1b-4b3f-812a-d8419ed44fb8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e77338c-2d1b-4b3f-812a-d8419ed44fb8" (UID: "5e77338c-2d1b-4b3f-812a-d8419ed44fb8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.725226 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kwrm4" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.725687 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kwrm4" event={"ID":"5e77338c-2d1b-4b3f-812a-d8419ed44fb8","Type":"ContainerDied","Data":"4bc92a92ce9e944811d18497801eb6e7147d985e8c1d698c9c6e449a4c0bb0c4"} Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.725709 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bc92a92ce9e944811d18497801eb6e7147d985e8c1d698c9c6e449a4c0bb0c4" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.731647 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e77338c-2d1b-4b3f-812a-d8419ed44fb8-kube-api-access-xpjf2" (OuterVolumeSpecName: "kube-api-access-xpjf2") pod "5e77338c-2d1b-4b3f-812a-d8419ed44fb8" (UID: "5e77338c-2d1b-4b3f-812a-d8419ed44fb8"). InnerVolumeSpecName "kube-api-access-xpjf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.732291 4585 scope.go:117] "RemoveContainer" containerID="7889205e17509d9adb2f2146398b2a92c586e75952713cd7b30905412665187d" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.745203 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.753167 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.757305 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:16:42 crc kubenswrapper[4585]: E1201 14:16:42.757632 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd5ea5ee-2cac-4fab-893d-83569ffbae4c" containerName="mariadb-database-create" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.757649 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd5ea5ee-2cac-4fab-893d-83569ffbae4c" containerName="mariadb-database-create" Dec 01 14:16:42 crc kubenswrapper[4585]: E1201 14:16:42.757666 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a302cf5-b263-4654-b7cc-e7122f4b11cb" containerName="glance-log" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.757673 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a302cf5-b263-4654-b7cc-e7122f4b11cb" containerName="glance-log" Dec 01 14:16:42 crc kubenswrapper[4585]: E1201 14:16:42.757685 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e77338c-2d1b-4b3f-812a-d8419ed44fb8" containerName="mariadb-database-create" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.757691 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e77338c-2d1b-4b3f-812a-d8419ed44fb8" containerName="mariadb-database-create" Dec 01 14:16:42 crc kubenswrapper[4585]: E1201 14:16:42.757706 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a302cf5-b263-4654-b7cc-e7122f4b11cb" containerName="glance-httpd" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.757712 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a302cf5-b263-4654-b7cc-e7122f4b11cb" containerName="glance-httpd" Dec 01 14:16:42 crc kubenswrapper[4585]: E1201 14:16:42.757729 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df1492cf-2c9b-4573-9e32-dd372af19bfe" containerName="mariadb-account-create-update" 
Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.757735 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="df1492cf-2c9b-4573-9e32-dd372af19bfe" containerName="mariadb-account-create-update" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.757895 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e77338c-2d1b-4b3f-812a-d8419ed44fb8" containerName="mariadb-database-create" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.757913 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a302cf5-b263-4654-b7cc-e7122f4b11cb" containerName="glance-httpd" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.757923 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a302cf5-b263-4654-b7cc-e7122f4b11cb" containerName="glance-log" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.757936 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="df1492cf-2c9b-4573-9e32-dd372af19bfe" containerName="mariadb-account-create-update" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.757947 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd5ea5ee-2cac-4fab-893d-83569ffbae4c" containerName="mariadb-database-create" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.758993 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.765289 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.765553 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.775361 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e570020-789a-4807-9cff-651caad31856","Type":"ContainerStarted","Data":"0220c1c7e1ba51f131400491785f0f65a9f6a09fe1cace441d1d3eb314a420e9"} Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.818671 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.819895 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bba1bc55-ebde-45b0-a2bf-1b05117fc134-logs\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.819957 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba1bc55-ebde-45b0-a2bf-1b05117fc134-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.820680 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bba1bc55-ebde-45b0-a2bf-1b05117fc134-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.820707 4585 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq2jg\" (UniqueName: \"kubernetes.io/projected/bba1bc55-ebde-45b0-a2bf-1b05117fc134-kube-api-access-lq2jg\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.820732 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba1bc55-ebde-45b0-a2bf-1b05117fc134-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.820761 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bba1bc55-ebde-45b0-a2bf-1b05117fc134-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.820782 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba1bc55-ebde-45b0-a2bf-1b05117fc134-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.820830 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.820927 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e77338c-2d1b-4b3f-812a-d8419ed44fb8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.820939 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpjf2\" (UniqueName: \"kubernetes.io/projected/5e77338c-2d1b-4b3f-812a-d8419ed44fb8-kube-api-access-xpjf2\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.923071 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bba1bc55-ebde-45b0-a2bf-1b05117fc134-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.923128 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba1bc55-ebde-45b0-a2bf-1b05117fc134-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.923201 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" 
(UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.923271 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bba1bc55-ebde-45b0-a2bf-1b05117fc134-logs\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.923323 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba1bc55-ebde-45b0-a2bf-1b05117fc134-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.923368 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bba1bc55-ebde-45b0-a2bf-1b05117fc134-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.923396 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq2jg\" (UniqueName: \"kubernetes.io/projected/bba1bc55-ebde-45b0-a2bf-1b05117fc134-kube-api-access-lq2jg\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.923426 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba1bc55-ebde-45b0-a2bf-1b05117fc134-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.930855 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bba1bc55-ebde-45b0-a2bf-1b05117fc134-logs\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.930990 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bba1bc55-ebde-45b0-a2bf-1b05117fc134-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.932922 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.942211 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba1bc55-ebde-45b0-a2bf-1b05117fc134-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 
crc kubenswrapper[4585]: I1201 14:16:42.956265 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba1bc55-ebde-45b0-a2bf-1b05117fc134-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.958454 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bba1bc55-ebde-45b0-a2bf-1b05117fc134-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:42 crc kubenswrapper[4585]: I1201 14:16:42.970261 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bba1bc55-ebde-45b0-a2bf-1b05117fc134-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.006461 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq2jg\" (UniqueName: \"kubernetes.io/projected/bba1bc55-ebde-45b0-a2bf-1b05117fc134-kube-api-access-lq2jg\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.097450 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-be71-account-create-update-jns4b" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.116264 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.128257 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"bba1bc55-ebde-45b0-a2bf-1b05117fc134\") " pod="openstack/glance-default-internal-api-0" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.165443 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9288a94-0029-4c7b-825a-ad1d005c736d-operator-scripts\") pod \"c9288a94-0029-4c7b-825a-ad1d005c736d\" (UID: \"c9288a94-0029-4c7b-825a-ad1d005c736d\") " Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.165791 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rvl4\" (UniqueName: \"kubernetes.io/projected/c9288a94-0029-4c7b-825a-ad1d005c736d-kube-api-access-5rvl4\") pod \"c9288a94-0029-4c7b-825a-ad1d005c736d\" (UID: \"c9288a94-0029-4c7b-825a-ad1d005c736d\") " Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.165813 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fae72ee-6647-4105-ab94-ed2ab6bed7da-operator-scripts\") pod \"8fae72ee-6647-4105-ab94-ed2ab6bed7da\" (UID: \"8fae72ee-6647-4105-ab94-ed2ab6bed7da\") " Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.165917 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvkrb\" (UniqueName: \"kubernetes.io/projected/8fae72ee-6647-4105-ab94-ed2ab6bed7da-kube-api-access-cvkrb\") pod \"8fae72ee-6647-4105-ab94-ed2ab6bed7da\" (UID: \"8fae72ee-6647-4105-ab94-ed2ab6bed7da\") " Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.170842 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fae72ee-6647-4105-ab94-ed2ab6bed7da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fae72ee-6647-4105-ab94-ed2ab6bed7da" (UID: "8fae72ee-6647-4105-ab94-ed2ab6bed7da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.171290 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9288a94-0029-4c7b-825a-ad1d005c736d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9288a94-0029-4c7b-825a-ad1d005c736d" (UID: "c9288a94-0029-4c7b-825a-ad1d005c736d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.173282 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fae72ee-6647-4105-ab94-ed2ab6bed7da-kube-api-access-cvkrb" (OuterVolumeSpecName: "kube-api-access-cvkrb") pod "8fae72ee-6647-4105-ab94-ed2ab6bed7da" (UID: "8fae72ee-6647-4105-ab94-ed2ab6bed7da"). InnerVolumeSpecName "kube-api-access-cvkrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.186312 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9288a94-0029-4c7b-825a-ad1d005c736d-kube-api-access-5rvl4" (OuterVolumeSpecName: "kube-api-access-5rvl4") pod "c9288a94-0029-4c7b-825a-ad1d005c736d" (UID: "c9288a94-0029-4c7b-825a-ad1d005c736d"). InnerVolumeSpecName "kube-api-access-5rvl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.241636 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nj957" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.267250 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jb6t\" (UniqueName: \"kubernetes.io/projected/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517-kube-api-access-6jb6t\") pod \"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517\" (UID: \"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517\") " Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.267522 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517-operator-scripts\") pod \"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517\" (UID: \"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517\") " Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.267890 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9288a94-0029-4c7b-825a-ad1d005c736d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.267961 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rvl4\" (UniqueName: \"kubernetes.io/projected/c9288a94-0029-4c7b-825a-ad1d005c736d-kube-api-access-5rvl4\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.268057 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fae72ee-6647-4105-ab94-ed2ab6bed7da-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.268185 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvkrb\" (UniqueName: \"kubernetes.io/projected/8fae72ee-6647-4105-ab94-ed2ab6bed7da-kube-api-access-cvkrb\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.267934 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf5a50c8-3aeb-4d5c-b313-b2eed4da3517" (UID: "cf5a50c8-3aeb-4d5c-b313-b2eed4da3517"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.276990 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517-kube-api-access-6jb6t" (OuterVolumeSpecName: "kube-api-access-6jb6t") pod "cf5a50c8-3aeb-4d5c-b313-b2eed4da3517" (UID: "cf5a50c8-3aeb-4d5c-b313-b2eed4da3517"). InnerVolumeSpecName "kube-api-access-6jb6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.369749 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jb6t\" (UniqueName: \"kubernetes.io/projected/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517-kube-api-access-6jb6t\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.369789 4585 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.427041 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.815408 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e570020-789a-4807-9cff-651caad31856","Type":"ContainerStarted","Data":"6a430f45d213c78044969f532ed92a6d7239c5aa4f85232ed1edc8be0b4fbab8"} Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.822146 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" event={"ID":"8fae72ee-6647-4105-ab94-ed2ab6bed7da","Type":"ContainerDied","Data":"f26036630c013e3c3c70cb8cb34d25c08e53d28eecb756c76aaf19b2e2b04444"} Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.822182 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f26036630c013e3c3c70cb8cb34d25c08e53d28eecb756c76aaf19b2e2b04444" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.822263 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-96f3-account-create-update-gbhsj" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.842239 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nj957" event={"ID":"cf5a50c8-3aeb-4d5c-b313-b2eed4da3517","Type":"ContainerDied","Data":"277d5f3f1d56f731f4b7678d72eba1fd9a49388f248c0e874a5b1091c482f2ee"} Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.842272 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="277d5f3f1d56f731f4b7678d72eba1fd9a49388f248c0e874a5b1091c482f2ee" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.842365 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nj957" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.869345 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-be71-account-create-update-jns4b" event={"ID":"c9288a94-0029-4c7b-825a-ad1d005c736d","Type":"ContainerDied","Data":"dab6cc6b3ded6b3262dc903b66498f01162fb4421d9572c89ddf80f6f24d47e3"} Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.869377 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dab6cc6b3ded6b3262dc903b66498f01162fb4421d9572c89ddf80f6f24d47e3" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.869430 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-be71-account-create-update-jns4b" Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.874497 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"952acec2-d757-4b65-aaf3-61bb69e5d5d7","Type":"ContainerStarted","Data":"123d5b1088c89677156026c522b443b87a2fd89fdf207be6326d5f80d7976e90"} Dec 01 14:16:43 crc kubenswrapper[4585]: I1201 14:16:43.964278 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 14:16:44 crc kubenswrapper[4585]: W1201 14:16:44.001047 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbba1bc55_ebde_45b0_a2bf_1b05117fc134.slice/crio-91332bc09498dd4b0783a755a7b7cae24908bbd4f334f2959b7e9da65f7cdd0b WatchSource:0}: Error finding container 91332bc09498dd4b0783a755a7b7cae24908bbd4f334f2959b7e9da65f7cdd0b: Status 404 returned error can't find the container with id 91332bc09498dd4b0783a755a7b7cae24908bbd4f334f2959b7e9da65f7cdd0b Dec 01 14:16:44 crc kubenswrapper[4585]: I1201 14:16:44.426355 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a302cf5-b263-4654-b7cc-e7122f4b11cb" path="/var/lib/kubelet/pods/6a302cf5-b263-4654-b7cc-e7122f4b11cb/volumes" Dec 01 14:16:44 crc kubenswrapper[4585]: I1201 14:16:44.897992 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e570020-789a-4807-9cff-651caad31856","Type":"ContainerStarted","Data":"8185c7c9a957ab6fc04421f09f56a52dfcc7bc02e49f0af1424fb1c61c49511d"} Dec 01 14:16:44 crc kubenswrapper[4585]: I1201 14:16:44.903792 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bba1bc55-ebde-45b0-a2bf-1b05117fc134","Type":"ContainerStarted","Data":"3174898fec1195a042b1ca390d001257b827a7fe022c04b6181b4531a624fbe3"} Dec 01 14:16:44 crc kubenswrapper[4585]: I1201 14:16:44.903844 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bba1bc55-ebde-45b0-a2bf-1b05117fc134","Type":"ContainerStarted","Data":"91332bc09498dd4b0783a755a7b7cae24908bbd4f334f2959b7e9da65f7cdd0b"} Dec 01 14:16:44 crc kubenswrapper[4585]: I1201 14:16:44.914245 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"952acec2-d757-4b65-aaf3-61bb69e5d5d7","Type":"ContainerStarted","Data":"8f15ffeed277e8980da79dbbb6e9381ce712052b6ad6600a2f0fe644e64e7738"} Dec 01 14:16:44 crc kubenswrapper[4585]: I1201 14:16:44.914993 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 14:16:44 crc kubenswrapper[4585]: I1201 14:16:44.961656 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.961640676 podStartE2EDuration="4.961640676s" podCreationTimestamp="2025-12-01 14:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:16:44.929626802 +0000 UTC m=+1118.913840667" watchObservedRunningTime="2025-12-01 14:16:44.961640676 +0000 UTC m=+1118.945854531" Dec 01 14:16:44 crc kubenswrapper[4585]: I1201 14:16:44.968425 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.968410896 podStartE2EDuration="5.968410896s" 
podCreationTimestamp="2025-12-01 14:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:16:44.95842863 +0000 UTC m=+1118.942642495" watchObservedRunningTime="2025-12-01 14:16:44.968410896 +0000 UTC m=+1118.952624751" Dec 01 14:16:45 crc kubenswrapper[4585]: I1201 14:16:45.933209 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bba1bc55-ebde-45b0-a2bf-1b05117fc134","Type":"ContainerStarted","Data":"63964dfa219c6e235b6812d20da7a3395f008509ffc3f5e4b86559c6868935ae"} Dec 01 14:16:46 crc kubenswrapper[4585]: I1201 14:16:45.995189 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.995171988 podStartE2EDuration="3.995171988s" podCreationTimestamp="2025-12-01 14:16:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:16:45.977582729 +0000 UTC m=+1119.961796604" watchObservedRunningTime="2025-12-01 14:16:45.995171988 +0000 UTC m=+1119.979385843" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.299518 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lc2cr"] Dec 01 14:16:47 crc kubenswrapper[4585]: E1201 14:16:47.300165 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5a50c8-3aeb-4d5c-b313-b2eed4da3517" containerName="mariadb-database-create" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.300177 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5a50c8-3aeb-4d5c-b313-b2eed4da3517" containerName="mariadb-database-create" Dec 01 14:16:47 crc kubenswrapper[4585]: E1201 14:16:47.300197 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fae72ee-6647-4105-ab94-ed2ab6bed7da" containerName="mariadb-account-create-update" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.300203 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fae72ee-6647-4105-ab94-ed2ab6bed7da" containerName="mariadb-account-create-update" Dec 01 14:16:47 crc kubenswrapper[4585]: E1201 14:16:47.300219 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9288a94-0029-4c7b-825a-ad1d005c736d" containerName="mariadb-account-create-update" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.300225 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9288a94-0029-4c7b-825a-ad1d005c736d" containerName="mariadb-account-create-update" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.300381 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9288a94-0029-4c7b-825a-ad1d005c736d" containerName="mariadb-account-create-update" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.300405 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fae72ee-6647-4105-ab94-ed2ab6bed7da" containerName="mariadb-account-create-update" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.300419 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5a50c8-3aeb-4d5c-b313-b2eed4da3517" containerName="mariadb-database-create" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.300924 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.308335 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dqv97" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.308575 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.308676 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.367514 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lc2cr"] Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.456539 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-scripts\") pod \"nova-cell0-conductor-db-sync-lc2cr\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.456586 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-config-data\") pod \"nova-cell0-conductor-db-sync-lc2cr\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.456612 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lc2cr\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.456693 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wdr4\" (UniqueName: \"kubernetes.io/projected/d820d720-5215-4178-ad7f-f0b493bf2529-kube-api-access-4wdr4\") pod \"nova-cell0-conductor-db-sync-lc2cr\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.558285 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-scripts\") pod \"nova-cell0-conductor-db-sync-lc2cr\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.558333 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-config-data\") pod \"nova-cell0-conductor-db-sync-lc2cr\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.558359 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lc2cr\" (UID: 
\"d820d720-5215-4178-ad7f-f0b493bf2529\") " pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.558450 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wdr4\" (UniqueName: \"kubernetes.io/projected/d820d720-5215-4178-ad7f-f0b493bf2529-kube-api-access-4wdr4\") pod \"nova-cell0-conductor-db-sync-lc2cr\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.571190 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-scripts\") pod \"nova-cell0-conductor-db-sync-lc2cr\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.583013 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lc2cr\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.586050 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-config-data\") pod \"nova-cell0-conductor-db-sync-lc2cr\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.593823 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wdr4\" (UniqueName: \"kubernetes.io/projected/d820d720-5215-4178-ad7f-f0b493bf2529-kube-api-access-4wdr4\") pod \"nova-cell0-conductor-db-sync-lc2cr\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:47 crc kubenswrapper[4585]: I1201 14:16:47.650394 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:16:48 crc kubenswrapper[4585]: I1201 14:16:48.168524 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lc2cr"] Dec 01 14:16:48 crc kubenswrapper[4585]: W1201 14:16:48.177634 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd820d720_5215_4178_ad7f_f0b493bf2529.slice/crio-4ffbd04fa99f43e3af4371a2d059ab721a3f955385adc7e2ff83f72718162ca6 WatchSource:0}: Error finding container 4ffbd04fa99f43e3af4371a2d059ab721a3f955385adc7e2ff83f72718162ca6: Status 404 returned error can't find the container with id 4ffbd04fa99f43e3af4371a2d059ab721a3f955385adc7e2ff83f72718162ca6 Dec 01 14:16:48 crc kubenswrapper[4585]: I1201 14:16:48.959819 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lc2cr" event={"ID":"d820d720-5215-4178-ad7f-f0b493bf2529","Type":"ContainerStarted","Data":"4ffbd04fa99f43e3af4371a2d059ab721a3f955385adc7e2ff83f72718162ca6"} Dec 01 14:16:50 crc kubenswrapper[4585]: I1201 14:16:50.633860 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 01 14:16:50 crc kubenswrapper[4585]: I1201 14:16:50.791143 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6bbf659b46-55tth" podUID="e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 01 14:16:50 crc kubenswrapper[4585]: I1201 14:16:50.902116 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 14:16:50 crc kubenswrapper[4585]: I1201 14:16:50.902171 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 14:16:50 crc kubenswrapper[4585]: I1201 14:16:50.947068 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 14:16:50 crc kubenswrapper[4585]: I1201 14:16:50.974930 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 14:16:51 crc kubenswrapper[4585]: I1201 14:16:51.008005 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 14:16:51 crc kubenswrapper[4585]: I1201 14:16:51.008077 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 14:16:52 crc kubenswrapper[4585]: I1201 14:16:52.307272 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 01 14:16:53 crc kubenswrapper[4585]: I1201 14:16:53.033300 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 14:16:53 crc kubenswrapper[4585]: I1201 14:16:53.033324 4585 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Dec 01 14:16:53 crc kubenswrapper[4585]: I1201 14:16:53.428997 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 14:16:53 crc kubenswrapper[4585]: I1201 14:16:53.429317 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 14:16:53 crc kubenswrapper[4585]: I1201 14:16:53.473704 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 14:16:53 crc kubenswrapper[4585]: I1201 14:16:53.487525 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 01 14:16:53 crc kubenswrapper[4585]: I1201 14:16:53.509917 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 14:16:54 crc kubenswrapper[4585]: I1201 14:16:54.041675 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 14:16:54 crc kubenswrapper[4585]: I1201 14:16:54.041733 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 14:16:54 crc kubenswrapper[4585]: I1201 14:16:54.211472 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 14:16:54 crc kubenswrapper[4585]: I1201 14:16:54.212060 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 14:16:54 crc kubenswrapper[4585]: I1201 14:16:54.224287 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 14:16:57 crc kubenswrapper[4585]: I1201 14:16:57.432094 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 14:16:57 crc kubenswrapper[4585]: I1201 14:16:57.432424 4585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 14:16:58 crc kubenswrapper[4585]: I1201 14:16:58.477105 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 14:17:00 crc kubenswrapper[4585]: I1201 14:17:00.636716 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 01 14:17:00 crc kubenswrapper[4585]: I1201 14:17:00.791727 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6bbf659b46-55tth" podUID="e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.149:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.149:8443: connect: connection refused" Dec 01 14:17:03 crc kubenswrapper[4585]: I1201 14:17:03.173091 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lc2cr" event={"ID":"d820d720-5215-4178-ad7f-f0b493bf2529","Type":"ContainerStarted","Data":"096cf172eca6df5c714d4622a97f04fa426f712f10545ae184a763b5eb8ad637"} Dec 01 14:17:03 crc kubenswrapper[4585]: I1201 14:17:03.219660 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-conductor-db-sync-lc2cr" podStartSLOduration=1.806638853 podStartE2EDuration="16.219642004s" podCreationTimestamp="2025-12-01 14:16:47 +0000 UTC" firstStartedPulling="2025-12-01 14:16:48.179673313 +0000 UTC m=+1122.163887168" lastFinishedPulling="2025-12-01 14:17:02.592676464 +0000 UTC m=+1136.576890319" observedRunningTime="2025-12-01 14:17:03.205204129 +0000 UTC m=+1137.189417984" watchObservedRunningTime="2025-12-01 14:17:03.219642004 +0000 UTC m=+1137.203855859" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.202573 4585 generic.go:334] "Generic (PLEG): container finished" podID="8915c969-9480-47da-90a9-311f504dbe66" containerID="0332d9fb0f3900b81d4e1cd08ad52c05c87ef2cec942a44c2959c9f7c374dd84" exitCode=137 Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.202858 4585 generic.go:334] "Generic (PLEG): container finished" podID="8915c969-9480-47da-90a9-311f504dbe66" containerID="0f29d76ec0ec7144d6b4986a3a21b9d271e69c019dc63122d74087048b21176d" exitCode=137 Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.202878 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8915c969-9480-47da-90a9-311f504dbe66","Type":"ContainerDied","Data":"0332d9fb0f3900b81d4e1cd08ad52c05c87ef2cec942a44c2959c9f7c374dd84"} Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.202902 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8915c969-9480-47da-90a9-311f504dbe66","Type":"ContainerDied","Data":"0f29d76ec0ec7144d6b4986a3a21b9d271e69c019dc63122d74087048b21176d"} Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.557331 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.640366 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8915c969-9480-47da-90a9-311f504dbe66-run-httpd\") pod \"8915c969-9480-47da-90a9-311f504dbe66\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.640594 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-combined-ca-bundle\") pod \"8915c969-9480-47da-90a9-311f504dbe66\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.640661 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtnws\" (UniqueName: \"kubernetes.io/projected/8915c969-9480-47da-90a9-311f504dbe66-kube-api-access-wtnws\") pod \"8915c969-9480-47da-90a9-311f504dbe66\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.640689 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-scripts\") pod \"8915c969-9480-47da-90a9-311f504dbe66\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.640704 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8915c969-9480-47da-90a9-311f504dbe66-log-httpd\") pod \"8915c969-9480-47da-90a9-311f504dbe66\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " Dec 01 14:17:05 crc 
kubenswrapper[4585]: I1201 14:17:05.640736 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-config-data\") pod \"8915c969-9480-47da-90a9-311f504dbe66\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.640765 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-sg-core-conf-yaml\") pod \"8915c969-9480-47da-90a9-311f504dbe66\" (UID: \"8915c969-9480-47da-90a9-311f504dbe66\") " Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.641910 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8915c969-9480-47da-90a9-311f504dbe66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8915c969-9480-47da-90a9-311f504dbe66" (UID: "8915c969-9480-47da-90a9-311f504dbe66"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.642353 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8915c969-9480-47da-90a9-311f504dbe66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8915c969-9480-47da-90a9-311f504dbe66" (UID: "8915c969-9480-47da-90a9-311f504dbe66"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.647517 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-scripts" (OuterVolumeSpecName: "scripts") pod "8915c969-9480-47da-90a9-311f504dbe66" (UID: "8915c969-9480-47da-90a9-311f504dbe66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.666043 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8915c969-9480-47da-90a9-311f504dbe66-kube-api-access-wtnws" (OuterVolumeSpecName: "kube-api-access-wtnws") pod "8915c969-9480-47da-90a9-311f504dbe66" (UID: "8915c969-9480-47da-90a9-311f504dbe66"). InnerVolumeSpecName "kube-api-access-wtnws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.673059 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8915c969-9480-47da-90a9-311f504dbe66" (UID: "8915c969-9480-47da-90a9-311f504dbe66"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.718509 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8915c969-9480-47da-90a9-311f504dbe66" (UID: "8915c969-9480-47da-90a9-311f504dbe66"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.742551 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8915c969-9480-47da-90a9-311f504dbe66-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.742590 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.742606 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtnws\" (UniqueName: \"kubernetes.io/projected/8915c969-9480-47da-90a9-311f504dbe66-kube-api-access-wtnws\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.742618 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.742628 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8915c969-9480-47da-90a9-311f504dbe66-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.742637 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.755269 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-config-data" (OuterVolumeSpecName: "config-data") pod "8915c969-9480-47da-90a9-311f504dbe66" (UID: "8915c969-9480-47da-90a9-311f504dbe66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:05 crc kubenswrapper[4585]: I1201 14:17:05.844572 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8915c969-9480-47da-90a9-311f504dbe66-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.213524 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8915c969-9480-47da-90a9-311f504dbe66","Type":"ContainerDied","Data":"312e5fc609f92a4eee4b75bc8b89ce949bea40137be21626fc69f9dbd5bb4724"} Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.213586 4585 scope.go:117] "RemoveContainer" containerID="0332d9fb0f3900b81d4e1cd08ad52c05c87ef2cec942a44c2959c9f7c374dd84" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.213744 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.264470 4585 scope.go:117] "RemoveContainer" containerID="2e26278ab2529c1e6c5a0052605bfdd6d26ffd2a44a92454941601340c2f2076" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.273247 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.301696 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.301813 4585 scope.go:117] "RemoveContainer" containerID="0f29d76ec0ec7144d6b4986a3a21b9d271e69c019dc63122d74087048b21176d" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.311564 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:17:06 crc kubenswrapper[4585]: E1201 14:17:06.311930 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="ceilometer-notification-agent" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.311941 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="ceilometer-notification-agent" Dec 01 14:17:06 crc kubenswrapper[4585]: E1201 14:17:06.311964 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="ceilometer-central-agent" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.311985 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="ceilometer-central-agent" Dec 01 14:17:06 crc kubenswrapper[4585]: E1201 14:17:06.312008 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="sg-core" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.312014 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="sg-core" Dec 01 14:17:06 crc kubenswrapper[4585]: E1201 14:17:06.312023 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="proxy-httpd" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.312029 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="proxy-httpd" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.312195 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="ceilometer-central-agent" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.312207 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="proxy-httpd" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.312217 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="ceilometer-notification-agent" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.312236 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8915c969-9480-47da-90a9-311f504dbe66" containerName="sg-core" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.322144 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.329453 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.329830 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.353777 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.355944 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c322955-ba56-4357-bc08-9828e570d4c8-log-httpd\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.356175 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mjg2\" (UniqueName: \"kubernetes.io/projected/8c322955-ba56-4357-bc08-9828e570d4c8-kube-api-access-6mjg2\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.356240 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-scripts\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.356340 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.356403 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.356445 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-config-data\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.356557 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c322955-ba56-4357-bc08-9828e570d4c8-run-httpd\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.379416 4585 scope.go:117] "RemoveContainer" containerID="0af8bea6bbe61b702f056d278f1efb6cafa8c8cb41dcfd8c3043d6af7e26cc13" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.437193 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8915c969-9480-47da-90a9-311f504dbe66" 
path="/var/lib/kubelet/pods/8915c969-9480-47da-90a9-311f504dbe66/volumes" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.458052 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.458106 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.458143 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-config-data\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.458231 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c322955-ba56-4357-bc08-9828e570d4c8-run-httpd\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.459031 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c322955-ba56-4357-bc08-9828e570d4c8-log-httpd\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.459165 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mjg2\" (UniqueName: \"kubernetes.io/projected/8c322955-ba56-4357-bc08-9828e570d4c8-kube-api-access-6mjg2\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.459212 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-scripts\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.459861 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c322955-ba56-4357-bc08-9828e570d4c8-log-httpd\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.460808 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c322955-ba56-4357-bc08-9828e570d4c8-run-httpd\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.465173 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-scripts\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " 
pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.469128 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.476416 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.482522 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-config-data\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.487817 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mjg2\" (UniqueName: \"kubernetes.io/projected/8c322955-ba56-4357-bc08-9828e570d4c8-kube-api-access-6mjg2\") pod \"ceilometer-0\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " pod="openstack/ceilometer-0" Dec 01 14:17:06 crc kubenswrapper[4585]: I1201 14:17:06.649589 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:17:07 crc kubenswrapper[4585]: I1201 14:17:07.278799 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:17:08 crc kubenswrapper[4585]: I1201 14:17:08.242869 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c322955-ba56-4357-bc08-9828e570d4c8","Type":"ContainerStarted","Data":"349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f"} Dec 01 14:17:08 crc kubenswrapper[4585]: I1201 14:17:08.243484 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c322955-ba56-4357-bc08-9828e570d4c8","Type":"ContainerStarted","Data":"a2da8c133db69f02f05b57a299d76c494c949338982f490843070fc82c66a6e8"} Dec 01 14:17:09 crc kubenswrapper[4585]: I1201 14:17:09.253173 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c322955-ba56-4357-bc08-9828e570d4c8","Type":"ContainerStarted","Data":"1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde"} Dec 01 14:17:10 crc kubenswrapper[4585]: I1201 14:17:10.263152 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c322955-ba56-4357-bc08-9828e570d4c8","Type":"ContainerStarted","Data":"0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887"} Dec 01 14:17:12 crc kubenswrapper[4585]: I1201 14:17:12.285227 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c322955-ba56-4357-bc08-9828e570d4c8","Type":"ContainerStarted","Data":"e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8"} Dec 01 14:17:12 crc kubenswrapper[4585]: I1201 14:17:12.286615 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 14:17:14 crc kubenswrapper[4585]: I1201 14:17:14.357663 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:17:14 crc kubenswrapper[4585]: I1201 14:17:14.412748 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.241046969 podStartE2EDuration="8.412723398s" podCreationTimestamp="2025-12-01 14:17:06 +0000 UTC" firstStartedPulling="2025-12-01 14:17:07.323895245 +0000 UTC m=+1141.308109100" lastFinishedPulling="2025-12-01 14:17:11.495571674 +0000 UTC m=+1145.479785529" observedRunningTime="2025-12-01 14:17:12.33090716 +0000 UTC m=+1146.315121015" watchObservedRunningTime="2025-12-01 14:17:14.412723398 +0000 UTC m=+1148.396937253" Dec 01 14:17:14 crc kubenswrapper[4585]: I1201 14:17:14.513509 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:17:16 crc kubenswrapper[4585]: I1201 14:17:16.505266 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6bbf659b46-55tth" Dec 01 14:17:16 crc kubenswrapper[4585]: I1201 14:17:16.581333 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f5b64975d-2mfhq"] Dec 01 14:17:16 crc kubenswrapper[4585]: I1201 14:17:16.581574 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon-log" containerID="cri-o://b777eaee6c922789f9a336f3ca54a96b3c11e519ca4993ef5e19721ca1c35cf5" gracePeriod=30 Dec 01 14:17:16 crc kubenswrapper[4585]: I1201 14:17:16.582602 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" containerID="cri-o://1f18573e739a744b3eabfcf4b261bed85eaea03874ad4e993ab153ba3999ffcf" gracePeriod=30 Dec 01 14:17:16 crc kubenswrapper[4585]: I1201 14:17:16.626220 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Dec 01 14:17:18 crc kubenswrapper[4585]: I1201 14:17:18.334880 4585 generic.go:334] "Generic (PLEG): container finished" podID="d820d720-5215-4178-ad7f-f0b493bf2529" containerID="096cf172eca6df5c714d4622a97f04fa426f712f10545ae184a763b5eb8ad637" exitCode=0 Dec 01 14:17:18 crc kubenswrapper[4585]: I1201 14:17:18.335681 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lc2cr" event={"ID":"d820d720-5215-4178-ad7f-f0b493bf2529","Type":"ContainerDied","Data":"096cf172eca6df5c714d4622a97f04fa426f712f10545ae184a763b5eb8ad637"} Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.689225 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.764403 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-config-data\") pod \"d820d720-5215-4178-ad7f-f0b493bf2529\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.767582 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-scripts\") pod \"d820d720-5215-4178-ad7f-f0b493bf2529\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.767639 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wdr4\" (UniqueName: \"kubernetes.io/projected/d820d720-5215-4178-ad7f-f0b493bf2529-kube-api-access-4wdr4\") pod \"d820d720-5215-4178-ad7f-f0b493bf2529\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.767664 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-combined-ca-bundle\") pod \"d820d720-5215-4178-ad7f-f0b493bf2529\" (UID: \"d820d720-5215-4178-ad7f-f0b493bf2529\") " Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.788554 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-scripts" (OuterVolumeSpecName: "scripts") pod "d820d720-5215-4178-ad7f-f0b493bf2529" (UID: "d820d720-5215-4178-ad7f-f0b493bf2529"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.791427 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d820d720-5215-4178-ad7f-f0b493bf2529-kube-api-access-4wdr4" (OuterVolumeSpecName: "kube-api-access-4wdr4") pod "d820d720-5215-4178-ad7f-f0b493bf2529" (UID: "d820d720-5215-4178-ad7f-f0b493bf2529"). InnerVolumeSpecName "kube-api-access-4wdr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.807056 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-config-data" (OuterVolumeSpecName: "config-data") pod "d820d720-5215-4178-ad7f-f0b493bf2529" (UID: "d820d720-5215-4178-ad7f-f0b493bf2529"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.819827 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d820d720-5215-4178-ad7f-f0b493bf2529" (UID: "d820d720-5215-4178-ad7f-f0b493bf2529"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.870030 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.870063 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wdr4\" (UniqueName: \"kubernetes.io/projected/d820d720-5215-4178-ad7f-f0b493bf2529-kube-api-access-4wdr4\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.870078 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:19 crc kubenswrapper[4585]: I1201 14:17:19.870088 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d820d720-5215-4178-ad7f-f0b493bf2529-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.352361 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lc2cr" event={"ID":"d820d720-5215-4178-ad7f-f0b493bf2529","Type":"ContainerDied","Data":"4ffbd04fa99f43e3af4371a2d059ab721a3f955385adc7e2ff83f72718162ca6"} Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.352740 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ffbd04fa99f43e3af4371a2d059ab721a3f955385adc7e2ff83f72718162ca6" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.352435 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lc2cr" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.459520 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 14:17:20 crc kubenswrapper[4585]: E1201 14:17:20.468138 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d820d720-5215-4178-ad7f-f0b493bf2529" containerName="nova-cell0-conductor-db-sync" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.468171 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d820d720-5215-4178-ad7f-f0b493bf2529" containerName="nova-cell0-conductor-db-sync" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.468380 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d820d720-5215-4178-ad7f-f0b493bf2529" containerName="nova-cell0-conductor-db-sync" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.468917 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.469032 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.471090 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dqv97" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.471284 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.588581 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae9e96d-f1e0-4183-9034-f553c8af4864-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1ae9e96d-f1e0-4183-9034-f553c8af4864\") " pod="openstack/nova-cell0-conductor-0" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.588662 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae9e96d-f1e0-4183-9034-f553c8af4864-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1ae9e96d-f1e0-4183-9034-f553c8af4864\") " pod="openstack/nova-cell0-conductor-0" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.588705 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml6nj\" (UniqueName: \"kubernetes.io/projected/1ae9e96d-f1e0-4183-9034-f553c8af4864-kube-api-access-ml6nj\") pod \"nova-cell0-conductor-0\" (UID: \"1ae9e96d-f1e0-4183-9034-f553c8af4864\") " pod="openstack/nova-cell0-conductor-0" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.690540 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae9e96d-f1e0-4183-9034-f553c8af4864-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1ae9e96d-f1e0-4183-9034-f553c8af4864\") " pod="openstack/nova-cell0-conductor-0" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.690672 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae9e96d-f1e0-4183-9034-f553c8af4864-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1ae9e96d-f1e0-4183-9034-f553c8af4864\") " pod="openstack/nova-cell0-conductor-0" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.690740 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml6nj\" (UniqueName: \"kubernetes.io/projected/1ae9e96d-f1e0-4183-9034-f553c8af4864-kube-api-access-ml6nj\") pod \"nova-cell0-conductor-0\" (UID: \"1ae9e96d-f1e0-4183-9034-f553c8af4864\") " pod="openstack/nova-cell0-conductor-0" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.695146 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae9e96d-f1e0-4183-9034-f553c8af4864-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1ae9e96d-f1e0-4183-9034-f553c8af4864\") " pod="openstack/nova-cell0-conductor-0" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.704663 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ae9e96d-f1e0-4183-9034-f553c8af4864-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1ae9e96d-f1e0-4183-9034-f553c8af4864\") " pod="openstack/nova-cell0-conductor-0" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.709164 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml6nj\" (UniqueName: \"kubernetes.io/projected/1ae9e96d-f1e0-4183-9034-f553c8af4864-kube-api-access-ml6nj\") pod \"nova-cell0-conductor-0\" (UID: \"1ae9e96d-f1e0-4183-9034-f553c8af4864\") " pod="openstack/nova-cell0-conductor-0" Dec 01 14:17:20 crc kubenswrapper[4585]: I1201 14:17:20.787772 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 14:17:21 crc kubenswrapper[4585]: I1201 14:17:21.093940 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:55768->10.217.0.147:8443: read: connection reset by peer" Dec 01 14:17:21 crc kubenswrapper[4585]: I1201 14:17:21.095100 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 01 14:17:21 crc kubenswrapper[4585]: I1201 14:17:21.364013 4585 generic.go:334] "Generic (PLEG): container finished" podID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerID="1f18573e739a744b3eabfcf4b261bed85eaea03874ad4e993ab153ba3999ffcf" exitCode=0 Dec 01 14:17:21 crc kubenswrapper[4585]: I1201 14:17:21.364062 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5b64975d-2mfhq" event={"ID":"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288","Type":"ContainerDied","Data":"1f18573e739a744b3eabfcf4b261bed85eaea03874ad4e993ab153ba3999ffcf"} Dec 01 14:17:21 crc kubenswrapper[4585]: I1201 14:17:21.364095 4585 scope.go:117] "RemoveContainer" containerID="93a074bee351885d31ad8a67b537a1943c95441154a2cae193cda36b50b6191f" Dec 01 14:17:21 crc kubenswrapper[4585]: I1201 14:17:21.405753 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 14:17:22 crc kubenswrapper[4585]: I1201 14:17:22.376938 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1ae9e96d-f1e0-4183-9034-f553c8af4864","Type":"ContainerStarted","Data":"d0da8d77178aac6df3193da8eb87aa8c7aac56448699c19d29eed571782c0872"} Dec 01 14:17:22 crc kubenswrapper[4585]: I1201 14:17:22.376992 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1ae9e96d-f1e0-4183-9034-f553c8af4864","Type":"ContainerStarted","Data":"14938f6e2c9742b7546f37a118065428ec7886a589a11fa266e993a9cb5a5cec"} Dec 01 14:17:22 crc kubenswrapper[4585]: I1201 14:17:22.402688 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.402668182 podStartE2EDuration="2.402668182s" podCreationTimestamp="2025-12-01 14:17:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:17:22.394163395 +0000 UTC m=+1156.378377260" watchObservedRunningTime="2025-12-01 14:17:22.402668182 +0000 UTC m=+1156.386882047" Dec 01 14:17:23 crc kubenswrapper[4585]: I1201 14:17:23.387051 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" 
Dec 01 14:17:30 crc kubenswrapper[4585]: I1201 14:17:30.632935 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 01 14:17:30 crc kubenswrapper[4585]: I1201 14:17:30.821715 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.260460 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-b7vt8"] Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.261880 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.264333 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.264566 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.280626 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-b7vt8"] Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.392923 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-config-data\") pod \"nova-cell0-cell-mapping-b7vt8\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.393005 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzc84\" (UniqueName: \"kubernetes.io/projected/31a54fa3-c02b-4d90-8375-b50ab8de60fe-kube-api-access-mzc84\") pod \"nova-cell0-cell-mapping-b7vt8\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.393052 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-scripts\") pod \"nova-cell0-cell-mapping-b7vt8\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.393110 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b7vt8\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.438575 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.440156 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.448115 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.454099 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.509429 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c214024-05df-4ee0-a121-04adc2e5f5c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c214024-05df-4ee0-a121-04adc2e5f5c5\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.509514 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b7vt8\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.509681 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c214024-05df-4ee0-a121-04adc2e5f5c5-config-data\") pod \"nova-scheduler-0\" (UID: \"0c214024-05df-4ee0-a121-04adc2e5f5c5\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.509715 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-config-data\") pod \"nova-cell0-cell-mapping-b7vt8\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.509796 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzc84\" (UniqueName: \"kubernetes.io/projected/31a54fa3-c02b-4d90-8375-b50ab8de60fe-kube-api-access-mzc84\") pod \"nova-cell0-cell-mapping-b7vt8\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.509930 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-scripts\") pod \"nova-cell0-cell-mapping-b7vt8\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.510006 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvsxs\" (UniqueName: \"kubernetes.io/projected/0c214024-05df-4ee0-a121-04adc2e5f5c5-kube-api-access-lvsxs\") pod \"nova-scheduler-0\" (UID: \"0c214024-05df-4ee0-a121-04adc2e5f5c5\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.536771 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-config-data\") pod \"nova-cell0-cell-mapping-b7vt8\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.545665 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-scripts\") pod \"nova-cell0-cell-mapping-b7vt8\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.592149 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b7vt8\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.626049 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.643137 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.647958 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzc84\" (UniqueName: \"kubernetes.io/projected/31a54fa3-c02b-4d90-8375-b50ab8de60fe-kube-api-access-mzc84\") pod \"nova-cell0-cell-mapping-b7vt8\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.659988 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.662160 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c214024-05df-4ee0-a121-04adc2e5f5c5-config-data\") pod \"nova-scheduler-0\" (UID: \"0c214024-05df-4ee0-a121-04adc2e5f5c5\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.662609 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.677512 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c214024-05df-4ee0-a121-04adc2e5f5c5-config-data\") pod \"nova-scheduler-0\" (UID: \"0c214024-05df-4ee0-a121-04adc2e5f5c5\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.679713 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.681177 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.683814 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvsxs\" (UniqueName: \"kubernetes.io/projected/0c214024-05df-4ee0-a121-04adc2e5f5c5-kube-api-access-lvsxs\") pod \"nova-scheduler-0\" (UID: \"0c214024-05df-4ee0-a121-04adc2e5f5c5\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.683945 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c214024-05df-4ee0-a121-04adc2e5f5c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c214024-05df-4ee0-a121-04adc2e5f5c5\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.691491 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.694028 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c214024-05df-4ee0-a121-04adc2e5f5c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c214024-05df-4ee0-a121-04adc2e5f5c5\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.706548 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.718930 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.775359 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvsxs\" (UniqueName: \"kubernetes.io/projected/0c214024-05df-4ee0-a121-04adc2e5f5c5-kube-api-access-lvsxs\") pod \"nova-scheduler-0\" (UID: \"0c214024-05df-4ee0-a121-04adc2e5f5c5\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.786326 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad4421f3-1cd6-47fe-a017-8b7ea120652c-logs\") pod \"nova-api-0\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.786399 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea2a563-6121-4f95-a12f-fdd9e258e240-logs\") pod \"nova-metadata-0\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.786431 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad4421f3-1cd6-47fe-a017-8b7ea120652c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.786462 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea2a563-6121-4f95-a12f-fdd9e258e240-config-data\") pod \"nova-metadata-0\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.786492 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5tzb\" (UniqueName: \"kubernetes.io/projected/cea2a563-6121-4f95-a12f-fdd9e258e240-kube-api-access-b5tzb\") pod \"nova-metadata-0\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.786514 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjhvt\" (UniqueName: \"kubernetes.io/projected/ad4421f3-1cd6-47fe-a017-8b7ea120652c-kube-api-access-fjhvt\") pod \"nova-api-0\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.786580 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4421f3-1cd6-47fe-a017-8b7ea120652c-config-data\") pod \"nova-api-0\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.786627 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea2a563-6121-4f95-a12f-fdd9e258e240-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.829905 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-hdklj"] Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.832102 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.845511 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-hdklj"] Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.889078 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-dns-svc\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.889350 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4421f3-1cd6-47fe-a017-8b7ea120652c-config-data\") pod \"nova-api-0\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.889514 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.889624 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqd2h\" (UniqueName: \"kubernetes.io/projected/c2cb820e-8840-4482-9866-e51474883db3-kube-api-access-dqd2h\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc 
kubenswrapper[4585]: I1201 14:17:31.889733 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea2a563-6121-4f95-a12f-fdd9e258e240-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.889847 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.889943 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.890120 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad4421f3-1cd6-47fe-a017-8b7ea120652c-logs\") pod \"nova-api-0\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.890217 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea2a563-6121-4f95-a12f-fdd9e258e240-logs\") pod \"nova-metadata-0\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.890342 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad4421f3-1cd6-47fe-a017-8b7ea120652c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.890434 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-config\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.890703 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea2a563-6121-4f95-a12f-fdd9e258e240-config-data\") pod \"nova-metadata-0\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.890826 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5tzb\" (UniqueName: \"kubernetes.io/projected/cea2a563-6121-4f95-a12f-fdd9e258e240-kube-api-access-b5tzb\") pod \"nova-metadata-0\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.890936 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjhvt\" (UniqueName: 
\"kubernetes.io/projected/ad4421f3-1cd6-47fe-a017-8b7ea120652c-kube-api-access-fjhvt\") pod \"nova-api-0\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.892385 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad4421f3-1cd6-47fe-a017-8b7ea120652c-logs\") pod \"nova-api-0\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.897280 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea2a563-6121-4f95-a12f-fdd9e258e240-logs\") pod \"nova-metadata-0\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.904058 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea2a563-6121-4f95-a12f-fdd9e258e240-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.904288 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4421f3-1cd6-47fe-a017-8b7ea120652c-config-data\") pod \"nova-api-0\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.917894 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea2a563-6121-4f95-a12f-fdd9e258e240-config-data\") pod \"nova-metadata-0\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.918580 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad4421f3-1cd6-47fe-a017-8b7ea120652c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.933736 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjhvt\" (UniqueName: \"kubernetes.io/projected/ad4421f3-1cd6-47fe-a017-8b7ea120652c-kube-api-access-fjhvt\") pod \"nova-api-0\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " pod="openstack/nova-api-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.942427 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5tzb\" (UniqueName: \"kubernetes.io/projected/cea2a563-6121-4f95-a12f-fdd9e258e240-kube-api-access-b5tzb\") pod \"nova-metadata-0\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " pod="openstack/nova-metadata-0" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.993047 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-dns-svc\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.993414 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.993453 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqd2h\" (UniqueName: \"kubernetes.io/projected/c2cb820e-8840-4482-9866-e51474883db3-kube-api-access-dqd2h\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.993485 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.993507 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.993563 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-config\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.995242 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-dns-svc\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.995984 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.996018 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:31 crc kubenswrapper[4585]: I1201 14:17:31.996827 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-config\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.000094 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-ovsdbserver-nb\") pod 
\"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.048675 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqd2h\" (UniqueName: \"kubernetes.io/projected/c2cb820e-8840-4482-9866-e51474883db3-kube-api-access-dqd2h\") pod \"dnsmasq-dns-757b4f8459-hdklj\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.060426 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.107740 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.124412 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.133032 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.145538 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.172283 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.206560 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b498816b-102e-4feb-bf5d-0666528a370d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b498816b-102e-4feb-bf5d-0666528a370d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.206636 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b498816b-102e-4feb-bf5d-0666528a370d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b498816b-102e-4feb-bf5d-0666528a370d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.206743 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg65b\" (UniqueName: \"kubernetes.io/projected/b498816b-102e-4feb-bf5d-0666528a370d-kube-api-access-fg65b\") pod \"nova-cell1-novncproxy-0\" (UID: \"b498816b-102e-4feb-bf5d-0666528a370d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.221518 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.275386 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.311095 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg65b\" (UniqueName: \"kubernetes.io/projected/b498816b-102e-4feb-bf5d-0666528a370d-kube-api-access-fg65b\") pod \"nova-cell1-novncproxy-0\" (UID: \"b498816b-102e-4feb-bf5d-0666528a370d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.311164 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b498816b-102e-4feb-bf5d-0666528a370d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b498816b-102e-4feb-bf5d-0666528a370d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.319749 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b498816b-102e-4feb-bf5d-0666528a370d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b498816b-102e-4feb-bf5d-0666528a370d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.331812 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b498816b-102e-4feb-bf5d-0666528a370d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b498816b-102e-4feb-bf5d-0666528a370d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.344542 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b498816b-102e-4feb-bf5d-0666528a370d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b498816b-102e-4feb-bf5d-0666528a370d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.357950 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg65b\" (UniqueName: \"kubernetes.io/projected/b498816b-102e-4feb-bf5d-0666528a370d-kube-api-access-fg65b\") pod \"nova-cell1-novncproxy-0\" (UID: \"b498816b-102e-4feb-bf5d-0666528a370d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.474852 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.578827 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-b7vt8"] Dec 01 14:17:32 crc kubenswrapper[4585]: W1201 14:17:32.666491 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31a54fa3_c02b_4d90_8375_b50ab8de60fe.slice/crio-91242f9d30aaa707bcf8da0d1dab507d5b36607e7ba0964ef1cda4fe99e13bd9 WatchSource:0}: Error finding container 91242f9d30aaa707bcf8da0d1dab507d5b36607e7ba0964ef1cda4fe99e13bd9: Status 404 returned error can't find the container with id 91242f9d30aaa707bcf8da0d1dab507d5b36607e7ba0964ef1cda4fe99e13bd9 Dec 01 14:17:32 crc kubenswrapper[4585]: I1201 14:17:32.927720 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.114132 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.124723 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:33 crc kubenswrapper[4585]: W1201 14:17:33.127591 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcea2a563_6121_4f95_a12f_fdd9e258e240.slice/crio-049dbb2bf5f890a147cebeff957bd8872a47880ee5b71790872ba69d661c795f WatchSource:0}: Error finding container 049dbb2bf5f890a147cebeff957bd8872a47880ee5b71790872ba69d661c795f: Status 404 returned error can't find the container with id 049dbb2bf5f890a147cebeff957bd8872a47880ee5b71790872ba69d661c795f Dec 01 14:17:33 crc kubenswrapper[4585]: W1201 14:17:33.228074 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2cb820e_8840_4482_9866_e51474883db3.slice/crio-f0ec22ede8bee54b854145ba441e3e7e356b11fe43a49b9d98eadee758589cc6 WatchSource:0}: Error finding container f0ec22ede8bee54b854145ba441e3e7e356b11fe43a49b9d98eadee758589cc6: Status 404 returned error can't find the container with id f0ec22ede8bee54b854145ba441e3e7e356b11fe43a49b9d98eadee758589cc6 Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.233108 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-hdklj"] Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.380069 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vfxqx"] Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.381789 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.406311 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.406535 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.426346 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vfxqx"] Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.441912 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.488245 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-scripts\") pod \"nova-cell1-conductor-db-sync-vfxqx\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.488676 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-config-data\") pod \"nova-cell1-conductor-db-sync-vfxqx\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.488753 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vfxqx\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.488782 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpg6g\" (UniqueName: \"kubernetes.io/projected/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-kube-api-access-gpg6g\") pod \"nova-cell1-conductor-db-sync-vfxqx\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.503139 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b498816b-102e-4feb-bf5d-0666528a370d","Type":"ContainerStarted","Data":"ff0a5cf12435ab507d74a93d86a901dda87ae28301dc967622d09cca206e9357"} Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.504732 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea2a563-6121-4f95-a12f-fdd9e258e240","Type":"ContainerStarted","Data":"049dbb2bf5f890a147cebeff957bd8872a47880ee5b71790872ba69d661c795f"} Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.506021 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ad4421f3-1cd6-47fe-a017-8b7ea120652c","Type":"ContainerStarted","Data":"c81dfcf7b5033dd0383669dca9466baeda977fd5424e13a96da8c21a0ec5570a"} Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.507380 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b7vt8" 
event={"ID":"31a54fa3-c02b-4d90-8375-b50ab8de60fe","Type":"ContainerStarted","Data":"fcea060dd8edaf28d77724cb85e1b60a91cb53c09162f38fdc417a7e1376ad5c"} Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.507409 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b7vt8" event={"ID":"31a54fa3-c02b-4d90-8375-b50ab8de60fe","Type":"ContainerStarted","Data":"91242f9d30aaa707bcf8da0d1dab507d5b36607e7ba0964ef1cda4fe99e13bd9"} Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.513323 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" event={"ID":"c2cb820e-8840-4482-9866-e51474883db3","Type":"ContainerStarted","Data":"f0ec22ede8bee54b854145ba441e3e7e356b11fe43a49b9d98eadee758589cc6"} Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.518678 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c214024-05df-4ee0-a121-04adc2e5f5c5","Type":"ContainerStarted","Data":"f00ab36b814b7361723f31bd40dd53dae8865a5b77e87e15813476fe60585576"} Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.529810 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-b7vt8" podStartSLOduration=2.529791952 podStartE2EDuration="2.529791952s" podCreationTimestamp="2025-12-01 14:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:17:33.52936255 +0000 UTC m=+1167.513576405" watchObservedRunningTime="2025-12-01 14:17:33.529791952 +0000 UTC m=+1167.514005807" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.590467 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-config-data\") pod \"nova-cell1-conductor-db-sync-vfxqx\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.590565 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vfxqx\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.590600 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpg6g\" (UniqueName: \"kubernetes.io/projected/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-kube-api-access-gpg6g\") pod \"nova-cell1-conductor-db-sync-vfxqx\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.590674 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-scripts\") pod \"nova-cell1-conductor-db-sync-vfxqx\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.596368 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-config-data\") pod \"nova-cell1-conductor-db-sync-vfxqx\" (UID: 
\"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.598911 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vfxqx\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.607396 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-scripts\") pod \"nova-cell1-conductor-db-sync-vfxqx\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.609657 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpg6g\" (UniqueName: \"kubernetes.io/projected/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-kube-api-access-gpg6g\") pod \"nova-cell1-conductor-db-sync-vfxqx\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:33 crc kubenswrapper[4585]: I1201 14:17:33.748382 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:34 crc kubenswrapper[4585]: I1201 14:17:34.298235 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vfxqx"] Dec 01 14:17:34 crc kubenswrapper[4585]: W1201 14:17:34.322174 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa5f1c18_8b57_47a1_ba5d_5ae47fd1ae30.slice/crio-b6385e120fa0d9c95ea71521ee4be688b0cfdc26b9095008f55c0e57eaa65d67 WatchSource:0}: Error finding container b6385e120fa0d9c95ea71521ee4be688b0cfdc26b9095008f55c0e57eaa65d67: Status 404 returned error can't find the container with id b6385e120fa0d9c95ea71521ee4be688b0cfdc26b9095008f55c0e57eaa65d67 Dec 01 14:17:34 crc kubenswrapper[4585]: I1201 14:17:34.532872 4585 generic.go:334] "Generic (PLEG): container finished" podID="c2cb820e-8840-4482-9866-e51474883db3" containerID="bdff6b51a06e09f8e1f880935bd6b22491c4015333f6ea4cf966f1fcf2bd3763" exitCode=0 Dec 01 14:17:34 crc kubenswrapper[4585]: I1201 14:17:34.532964 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" event={"ID":"c2cb820e-8840-4482-9866-e51474883db3","Type":"ContainerDied","Data":"bdff6b51a06e09f8e1f880935bd6b22491c4015333f6ea4cf966f1fcf2bd3763"} Dec 01 14:17:34 crc kubenswrapper[4585]: I1201 14:17:34.542678 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vfxqx" event={"ID":"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30","Type":"ContainerStarted","Data":"b6385e120fa0d9c95ea71521ee4be688b0cfdc26b9095008f55c0e57eaa65d67"} Dec 01 14:17:35 crc kubenswrapper[4585]: I1201 14:17:35.571561 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vfxqx" event={"ID":"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30","Type":"ContainerStarted","Data":"166f287ec8ac7df3fc80d66b577864cf2107dbc4710cf00e57d7f82a993c8ae9"} Dec 01 14:17:35 crc kubenswrapper[4585]: I1201 14:17:35.578409 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" 
event={"ID":"c2cb820e-8840-4482-9866-e51474883db3","Type":"ContainerStarted","Data":"de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a"} Dec 01 14:17:35 crc kubenswrapper[4585]: I1201 14:17:35.579234 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:35 crc kubenswrapper[4585]: I1201 14:17:35.616272 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-vfxqx" podStartSLOduration=2.616253694 podStartE2EDuration="2.616253694s" podCreationTimestamp="2025-12-01 14:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:17:35.591317589 +0000 UTC m=+1169.575531464" watchObservedRunningTime="2025-12-01 14:17:35.616253694 +0000 UTC m=+1169.600467549" Dec 01 14:17:35 crc kubenswrapper[4585]: I1201 14:17:35.617763 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" podStartSLOduration=4.617747144 podStartE2EDuration="4.617747144s" podCreationTimestamp="2025-12-01 14:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:17:35.610354276 +0000 UTC m=+1169.594568131" watchObservedRunningTime="2025-12-01 14:17:35.617747144 +0000 UTC m=+1169.601960989" Dec 01 14:17:35 crc kubenswrapper[4585]: I1201 14:17:35.863896 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:35 crc kubenswrapper[4585]: I1201 14:17:35.894549 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 14:17:36 crc kubenswrapper[4585]: I1201 14:17:36.664010 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.693170 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c214024-05df-4ee0-a121-04adc2e5f5c5","Type":"ContainerStarted","Data":"8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7"} Dec 01 14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.794659 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b498816b-102e-4feb-bf5d-0666528a370d","Type":"ContainerStarted","Data":"e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751"} Dec 01 14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.795773 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b498816b-102e-4feb-bf5d-0666528a370d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751" gracePeriod=30 Dec 01 14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.804009 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea2a563-6121-4f95-a12f-fdd9e258e240","Type":"ContainerStarted","Data":"2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8"} Dec 01 14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.804061 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea2a563-6121-4f95-a12f-fdd9e258e240","Type":"ContainerStarted","Data":"ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2"} Dec 01 
14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.804234 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cea2a563-6121-4f95-a12f-fdd9e258e240" containerName="nova-metadata-log" containerID="cri-o://ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2" gracePeriod=30 Dec 01 14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.804350 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cea2a563-6121-4f95-a12f-fdd9e258e240" containerName="nova-metadata-metadata" containerID="cri-o://2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8" gracePeriod=30 Dec 01 14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.809700 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ad4421f3-1cd6-47fe-a017-8b7ea120652c","Type":"ContainerStarted","Data":"cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518"} Dec 01 14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.809763 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ad4421f3-1cd6-47fe-a017-8b7ea120652c","Type":"ContainerStarted","Data":"322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd"} Dec 01 14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.831927 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.140342026 podStartE2EDuration="7.831905568s" podCreationTimestamp="2025-12-01 14:17:31 +0000 UTC" firstStartedPulling="2025-12-01 14:17:32.953618108 +0000 UTC m=+1166.937831953" lastFinishedPulling="2025-12-01 14:17:37.64518164 +0000 UTC m=+1171.629395495" observedRunningTime="2025-12-01 14:17:38.793503884 +0000 UTC m=+1172.777717739" watchObservedRunningTime="2025-12-01 14:17:38.831905568 +0000 UTC m=+1172.816119423" Dec 01 14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.833891 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.608876879 podStartE2EDuration="6.83387997s" podCreationTimestamp="2025-12-01 14:17:32 +0000 UTC" firstStartedPulling="2025-12-01 14:17:33.43933738 +0000 UTC m=+1167.423551235" lastFinishedPulling="2025-12-01 14:17:37.664340471 +0000 UTC m=+1171.648554326" observedRunningTime="2025-12-01 14:17:38.816075006 +0000 UTC m=+1172.800288861" watchObservedRunningTime="2025-12-01 14:17:38.83387997 +0000 UTC m=+1172.818093835" Dec 01 14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.882404 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.375020224 podStartE2EDuration="7.882386644s" podCreationTimestamp="2025-12-01 14:17:31 +0000 UTC" firstStartedPulling="2025-12-01 14:17:33.143133422 +0000 UTC m=+1167.127347267" lastFinishedPulling="2025-12-01 14:17:37.650499842 +0000 UTC m=+1171.634713687" observedRunningTime="2025-12-01 14:17:38.880585066 +0000 UTC m=+1172.864798921" watchObservedRunningTime="2025-12-01 14:17:38.882386644 +0000 UTC m=+1172.866600499" Dec 01 14:17:38 crc kubenswrapper[4585]: I1201 14:17:38.961583 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.44799773 podStartE2EDuration="7.961559986s" podCreationTimestamp="2025-12-01 14:17:31 +0000 UTC" firstStartedPulling="2025-12-01 14:17:33.134647575 +0000 UTC m=+1167.118861430" lastFinishedPulling="2025-12-01 14:17:37.648209831 
+0000 UTC m=+1171.632423686" observedRunningTime="2025-12-01 14:17:38.908066029 +0000 UTC m=+1172.892279894" watchObservedRunningTime="2025-12-01 14:17:38.961559986 +0000 UTC m=+1172.945773841" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.535911 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.599496 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea2a563-6121-4f95-a12f-fdd9e258e240-logs\") pod \"cea2a563-6121-4f95-a12f-fdd9e258e240\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.599651 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea2a563-6121-4f95-a12f-fdd9e258e240-combined-ca-bundle\") pod \"cea2a563-6121-4f95-a12f-fdd9e258e240\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.599699 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea2a563-6121-4f95-a12f-fdd9e258e240-config-data\") pod \"cea2a563-6121-4f95-a12f-fdd9e258e240\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.599737 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5tzb\" (UniqueName: \"kubernetes.io/projected/cea2a563-6121-4f95-a12f-fdd9e258e240-kube-api-access-b5tzb\") pod \"cea2a563-6121-4f95-a12f-fdd9e258e240\" (UID: \"cea2a563-6121-4f95-a12f-fdd9e258e240\") " Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.606291 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea2a563-6121-4f95-a12f-fdd9e258e240-logs" (OuterVolumeSpecName: "logs") pod "cea2a563-6121-4f95-a12f-fdd9e258e240" (UID: "cea2a563-6121-4f95-a12f-fdd9e258e240"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.609381 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea2a563-6121-4f95-a12f-fdd9e258e240-kube-api-access-b5tzb" (OuterVolumeSpecName: "kube-api-access-b5tzb") pod "cea2a563-6121-4f95-a12f-fdd9e258e240" (UID: "cea2a563-6121-4f95-a12f-fdd9e258e240"). InnerVolumeSpecName "kube-api-access-b5tzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.654144 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea2a563-6121-4f95-a12f-fdd9e258e240-config-data" (OuterVolumeSpecName: "config-data") pod "cea2a563-6121-4f95-a12f-fdd9e258e240" (UID: "cea2a563-6121-4f95-a12f-fdd9e258e240"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.678082 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea2a563-6121-4f95-a12f-fdd9e258e240-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cea2a563-6121-4f95-a12f-fdd9e258e240" (UID: "cea2a563-6121-4f95-a12f-fdd9e258e240"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.702194 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea2a563-6121-4f95-a12f-fdd9e258e240-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.702231 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea2a563-6121-4f95-a12f-fdd9e258e240-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.702242 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5tzb\" (UniqueName: \"kubernetes.io/projected/cea2a563-6121-4f95-a12f-fdd9e258e240-kube-api-access-b5tzb\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.702251 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cea2a563-6121-4f95-a12f-fdd9e258e240-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.819321 4585 generic.go:334] "Generic (PLEG): container finished" podID="cea2a563-6121-4f95-a12f-fdd9e258e240" containerID="2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8" exitCode=0 Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.819348 4585 generic.go:334] "Generic (PLEG): container finished" podID="cea2a563-6121-4f95-a12f-fdd9e258e240" containerID="ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2" exitCode=143 Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.820252 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.826931 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea2a563-6121-4f95-a12f-fdd9e258e240","Type":"ContainerDied","Data":"2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8"} Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.826992 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea2a563-6121-4f95-a12f-fdd9e258e240","Type":"ContainerDied","Data":"ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2"} Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.827003 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cea2a563-6121-4f95-a12f-fdd9e258e240","Type":"ContainerDied","Data":"049dbb2bf5f890a147cebeff957bd8872a47880ee5b71790872ba69d661c795f"} Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.827020 4585 scope.go:117] "RemoveContainer" containerID="2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.856605 4585 scope.go:117] "RemoveContainer" containerID="ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.866308 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.900106 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.901326 4585 scope.go:117] "RemoveContainer" containerID="2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8" Dec 01 
14:17:39 crc kubenswrapper[4585]: E1201 14:17:39.903394 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8\": container with ID starting with 2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8 not found: ID does not exist" containerID="2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.903423 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8"} err="failed to get container status \"2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8\": rpc error: code = NotFound desc = could not find container \"2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8\": container with ID starting with 2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8 not found: ID does not exist" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.903446 4585 scope.go:117] "RemoveContainer" containerID="ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2" Dec 01 14:17:39 crc kubenswrapper[4585]: E1201 14:17:39.904527 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2\": container with ID starting with ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2 not found: ID does not exist" containerID="ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.904553 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2"} err="failed to get container status \"ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2\": rpc error: code = NotFound desc = could not find container \"ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2\": container with ID starting with ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2 not found: ID does not exist" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.904566 4585 scope.go:117] "RemoveContainer" containerID="2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.906686 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8"} err="failed to get container status \"2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8\": rpc error: code = NotFound desc = could not find container \"2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8\": container with ID starting with 2a661329c834c2f39238dea87cd7f7d7b9c8c05d2d98e22e9ac80223547f65f8 not found: ID does not exist" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.906706 4585 scope.go:117] "RemoveContainer" containerID="ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.908801 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2"} err="failed to get container status 
\"ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2\": rpc error: code = NotFound desc = could not find container \"ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2\": container with ID starting with ec9281af5e958d8af97ee02e003d40a0ee1ec71e73efdabc3d382c6fc62866f2 not found: ID does not exist" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.927445 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:39 crc kubenswrapper[4585]: E1201 14:17:39.927911 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea2a563-6121-4f95-a12f-fdd9e258e240" containerName="nova-metadata-metadata" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.927929 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea2a563-6121-4f95-a12f-fdd9e258e240" containerName="nova-metadata-metadata" Dec 01 14:17:39 crc kubenswrapper[4585]: E1201 14:17:39.927944 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea2a563-6121-4f95-a12f-fdd9e258e240" containerName="nova-metadata-log" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.927951 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea2a563-6121-4f95-a12f-fdd9e258e240" containerName="nova-metadata-log" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.928174 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea2a563-6121-4f95-a12f-fdd9e258e240" containerName="nova-metadata-log" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.928198 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea2a563-6121-4f95-a12f-fdd9e258e240" containerName="nova-metadata-metadata" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.929302 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.932259 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.932558 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 14:17:39 crc kubenswrapper[4585]: I1201 14:17:39.952458 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.109070 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6mpf\" (UniqueName: \"kubernetes.io/projected/0b1b5aa5-9df7-422b-96a9-f5f381caf344-kube-api-access-q6mpf\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.109116 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-config-data\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.109165 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.109195 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b1b5aa5-9df7-422b-96a9-f5f381caf344-logs\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.109508 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.211428 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b1b5aa5-9df7-422b-96a9-f5f381caf344-logs\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.211534 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.211589 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6mpf\" (UniqueName: \"kubernetes.io/projected/0b1b5aa5-9df7-422b-96a9-f5f381caf344-kube-api-access-q6mpf\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " 
pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.211608 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-config-data\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.211663 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.212509 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b1b5aa5-9df7-422b-96a9-f5f381caf344-logs\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.217876 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.219009 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-config-data\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.221962 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.236503 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6mpf\" (UniqueName: \"kubernetes.io/projected/0b1b5aa5-9df7-422b-96a9-f5f381caf344-kube-api-access-q6mpf\") pod \"nova-metadata-0\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.256685 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.454202 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea2a563-6121-4f95-a12f-fdd9e258e240" path="/var/lib/kubelet/pods/cea2a563-6121-4f95-a12f-fdd9e258e240/volumes" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.637174 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6f5b64975d-2mfhq" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 01 14:17:40 crc kubenswrapper[4585]: I1201 14:17:40.961385 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:41 crc kubenswrapper[4585]: I1201 14:17:41.848388 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b1b5aa5-9df7-422b-96a9-f5f381caf344","Type":"ContainerStarted","Data":"2cb5e933ddfa0be02436199b66a85f5dade5e20a4bed7bd3bf3f35e4856d32b3"} Dec 01 14:17:41 crc kubenswrapper[4585]: I1201 14:17:41.849392 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b1b5aa5-9df7-422b-96a9-f5f381caf344","Type":"ContainerStarted","Data":"67cfb6fa85a26b00c8e0ea9c2dedcad370a340f13fd73bae008662cda3e46254"} Dec 01 14:17:41 crc kubenswrapper[4585]: I1201 14:17:41.849416 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b1b5aa5-9df7-422b-96a9-f5f381caf344","Type":"ContainerStarted","Data":"4eb0055fd0e47949e48d071daced06d069354207b06a23add545bf51a63de582"} Dec 01 14:17:41 crc kubenswrapper[4585]: I1201 14:17:41.876169 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.876151181 podStartE2EDuration="2.876151181s" podCreationTimestamp="2025-12-01 14:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:17:41.869452062 +0000 UTC m=+1175.853665907" watchObservedRunningTime="2025-12-01 14:17:41.876151181 +0000 UTC m=+1175.860365036" Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.061242 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.061286 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.088855 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.224340 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.224384 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.277851 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.337250 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-tztg6"] Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 
14:17:42.337546 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" podUID="ee30ed1d-c158-48eb-b68f-1e613718edeb" containerName="dnsmasq-dns" containerID="cri-o://745b88980361c0a8994adb67d0c89d686e0571033b3b0a60d64f9b7fb2aead6f" gracePeriod=10 Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.476085 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.518200 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.521953 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b47683b1-2753-468a-b272-f9f1760a71f3" containerName="kube-state-metrics" containerID="cri-o://8147e2dbb97af8be7979d82e5fa350b2f97bc7cf7ff3d0b0cf18ef412e320908" gracePeriod=30 Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.878794 4585 generic.go:334] "Generic (PLEG): container finished" podID="ee30ed1d-c158-48eb-b68f-1e613718edeb" containerID="745b88980361c0a8994adb67d0c89d686e0571033b3b0a60d64f9b7fb2aead6f" exitCode=0 Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.879067 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" event={"ID":"ee30ed1d-c158-48eb-b68f-1e613718edeb","Type":"ContainerDied","Data":"745b88980361c0a8994adb67d0c89d686e0571033b3b0a60d64f9b7fb2aead6f"} Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.881671 4585 generic.go:334] "Generic (PLEG): container finished" podID="b47683b1-2753-468a-b272-f9f1760a71f3" containerID="8147e2dbb97af8be7979d82e5fa350b2f97bc7cf7ff3d0b0cf18ef412e320908" exitCode=2 Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.881838 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b47683b1-2753-468a-b272-f9f1760a71f3","Type":"ContainerDied","Data":"8147e2dbb97af8be7979d82e5fa350b2f97bc7cf7ff3d0b0cf18ef412e320908"} Dec 01 14:17:42 crc kubenswrapper[4585]: I1201 14:17:42.944581 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.049481 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.177874 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-dns-swift-storage-0\") pod \"ee30ed1d-c158-48eb-b68f-1e613718edeb\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.177937 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-config\") pod \"ee30ed1d-c158-48eb-b68f-1e613718edeb\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.178026 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-ovsdbserver-sb\") pod \"ee30ed1d-c158-48eb-b68f-1e613718edeb\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.178074 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjltc\" (UniqueName: \"kubernetes.io/projected/ee30ed1d-c158-48eb-b68f-1e613718edeb-kube-api-access-qjltc\") pod \"ee30ed1d-c158-48eb-b68f-1e613718edeb\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.178146 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-dns-svc\") pod \"ee30ed1d-c158-48eb-b68f-1e613718edeb\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.178171 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-ovsdbserver-nb\") pod \"ee30ed1d-c158-48eb-b68f-1e613718edeb\" (UID: \"ee30ed1d-c158-48eb-b68f-1e613718edeb\") " Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.191125 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee30ed1d-c158-48eb-b68f-1e613718edeb-kube-api-access-qjltc" (OuterVolumeSpecName: "kube-api-access-qjltc") pod "ee30ed1d-c158-48eb-b68f-1e613718edeb" (UID: "ee30ed1d-c158-48eb-b68f-1e613718edeb"). InnerVolumeSpecName "kube-api-access-qjltc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.275157 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.280588 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjltc\" (UniqueName: \"kubernetes.io/projected/ee30ed1d-c158-48eb-b68f-1e613718edeb-kube-api-access-qjltc\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.289758 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee30ed1d-c158-48eb-b68f-1e613718edeb" (UID: "ee30ed1d-c158-48eb-b68f-1e613718edeb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.302812 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ee30ed1d-c158-48eb-b68f-1e613718edeb" (UID: "ee30ed1d-c158-48eb-b68f-1e613718edeb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.309161 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.309503 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.311187 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee30ed1d-c158-48eb-b68f-1e613718edeb" (UID: "ee30ed1d-c158-48eb-b68f-1e613718edeb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.314520 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-config" (OuterVolumeSpecName: "config") pod "ee30ed1d-c158-48eb-b68f-1e613718edeb" (UID: "ee30ed1d-c158-48eb-b68f-1e613718edeb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.339861 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee30ed1d-c158-48eb-b68f-1e613718edeb" (UID: "ee30ed1d-c158-48eb-b68f-1e613718edeb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.382063 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-585z6\" (UniqueName: \"kubernetes.io/projected/b47683b1-2753-468a-b272-f9f1760a71f3-kube-api-access-585z6\") pod \"b47683b1-2753-468a-b272-f9f1760a71f3\" (UID: \"b47683b1-2753-468a-b272-f9f1760a71f3\") " Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.382888 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.383468 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.383564 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.383687 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.383780 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee30ed1d-c158-48eb-b68f-1e613718edeb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.386259 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b47683b1-2753-468a-b272-f9f1760a71f3-kube-api-access-585z6" (OuterVolumeSpecName: "kube-api-access-585z6") pod "b47683b1-2753-468a-b272-f9f1760a71f3" (UID: "b47683b1-2753-468a-b272-f9f1760a71f3"). InnerVolumeSpecName "kube-api-access-585z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.486126 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-585z6\" (UniqueName: \"kubernetes.io/projected/b47683b1-2753-468a-b272-f9f1760a71f3-kube-api-access-585z6\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.715720 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.715769 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.891599 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.891621 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-tztg6" event={"ID":"ee30ed1d-c158-48eb-b68f-1e613718edeb","Type":"ContainerDied","Data":"1d301e9a19e493643cf7c617958a39eac70913defa3d4b43b0ee5055feae4ab1"} Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.891669 4585 scope.go:117] "RemoveContainer" containerID="745b88980361c0a8994adb67d0c89d686e0571033b3b0a60d64f9b7fb2aead6f" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.894309 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.902334 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b47683b1-2753-468a-b272-f9f1760a71f3","Type":"ContainerDied","Data":"a11ccd462fd7d2294421da4cc2582211d5c2da0344a85037b5341f6ea6cccbfa"} Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.916023 4585 scope.go:117] "RemoveContainer" containerID="53be4dc2e21501649ee25680a49eabe364a6a4c3326076876a18fdee764d3604" Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.932988 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-tztg6"] Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.970153 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-tztg6"] Dec 01 14:17:43 crc kubenswrapper[4585]: I1201 14:17:43.989345 4585 scope.go:117] "RemoveContainer" containerID="8147e2dbb97af8be7979d82e5fa350b2f97bc7cf7ff3d0b0cf18ef412e320908" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.013410 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.048702 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.055905 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 14:17:44 crc kubenswrapper[4585]: E1201 14:17:44.056381 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee30ed1d-c158-48eb-b68f-1e613718edeb" containerName="init" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.056406 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee30ed1d-c158-48eb-b68f-1e613718edeb" containerName="init" Dec 01 14:17:44 crc kubenswrapper[4585]: E1201 14:17:44.056431 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee30ed1d-c158-48eb-b68f-1e613718edeb" containerName="dnsmasq-dns" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.056438 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee30ed1d-c158-48eb-b68f-1e613718edeb" containerName="dnsmasq-dns" Dec 01 14:17:44 crc kubenswrapper[4585]: E1201 14:17:44.056455 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b47683b1-2753-468a-b272-f9f1760a71f3" containerName="kube-state-metrics" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.056463 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b47683b1-2753-468a-b272-f9f1760a71f3" containerName="kube-state-metrics" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.056648 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b47683b1-2753-468a-b272-f9f1760a71f3" 
containerName="kube-state-metrics" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.056663 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee30ed1d-c158-48eb-b68f-1e613718edeb" containerName="dnsmasq-dns" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.057429 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.059516 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.060345 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.077031 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.197208 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6dc37b7-09de-4e17-9d88-358b3d3d5908-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e6dc37b7-09de-4e17-9d88-358b3d3d5908\") " pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.197254 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e6dc37b7-09de-4e17-9d88-358b3d3d5908-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e6dc37b7-09de-4e17-9d88-358b3d3d5908\") " pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.197732 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6dc37b7-09de-4e17-9d88-358b3d3d5908-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e6dc37b7-09de-4e17-9d88-358b3d3d5908\") " pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.197780 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnhbc\" (UniqueName: \"kubernetes.io/projected/e6dc37b7-09de-4e17-9d88-358b3d3d5908-kube-api-access-bnhbc\") pod \"kube-state-metrics-0\" (UID: \"e6dc37b7-09de-4e17-9d88-358b3d3d5908\") " pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.299486 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6dc37b7-09de-4e17-9d88-358b3d3d5908-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e6dc37b7-09de-4e17-9d88-358b3d3d5908\") " pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.299530 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e6dc37b7-09de-4e17-9d88-358b3d3d5908-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e6dc37b7-09de-4e17-9d88-358b3d3d5908\") " pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.300252 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6dc37b7-09de-4e17-9d88-358b3d3d5908-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e6dc37b7-09de-4e17-9d88-358b3d3d5908\") " pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.300281 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnhbc\" (UniqueName: \"kubernetes.io/projected/e6dc37b7-09de-4e17-9d88-358b3d3d5908-kube-api-access-bnhbc\") pod \"kube-state-metrics-0\" (UID: \"e6dc37b7-09de-4e17-9d88-358b3d3d5908\") " pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.303754 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e6dc37b7-09de-4e17-9d88-358b3d3d5908-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e6dc37b7-09de-4e17-9d88-358b3d3d5908\") " pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.305200 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6dc37b7-09de-4e17-9d88-358b3d3d5908-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e6dc37b7-09de-4e17-9d88-358b3d3d5908\") " pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.307242 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6dc37b7-09de-4e17-9d88-358b3d3d5908-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e6dc37b7-09de-4e17-9d88-358b3d3d5908\") " pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.317470 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnhbc\" (UniqueName: \"kubernetes.io/projected/e6dc37b7-09de-4e17-9d88-358b3d3d5908-kube-api-access-bnhbc\") pod \"kube-state-metrics-0\" (UID: \"e6dc37b7-09de-4e17-9d88-358b3d3d5908\") " pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.377366 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.427158 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b47683b1-2753-468a-b272-f9f1760a71f3" path="/var/lib/kubelet/pods/b47683b1-2753-468a-b272-f9f1760a71f3/volumes" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.427704 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee30ed1d-c158-48eb-b68f-1e613718edeb" path="/var/lib/kubelet/pods/ee30ed1d-c158-48eb-b68f-1e613718edeb/volumes" Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.892701 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.902413 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.907899 4585 generic.go:334] "Generic (PLEG): container finished" podID="fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30" containerID="166f287ec8ac7df3fc80d66b577864cf2107dbc4710cf00e57d7f82a993c8ae9" exitCode=0 Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.907991 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vfxqx" event={"ID":"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30","Type":"ContainerDied","Data":"166f287ec8ac7df3fc80d66b577864cf2107dbc4710cf00e57d7f82a993c8ae9"} Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.916051 4585 generic.go:334] "Generic (PLEG): container finished" podID="31a54fa3-c02b-4d90-8375-b50ab8de60fe" containerID="fcea060dd8edaf28d77724cb85e1b60a91cb53c09162f38fdc417a7e1376ad5c" exitCode=0 Dec 01 14:17:44 crc kubenswrapper[4585]: I1201 14:17:44.916224 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b7vt8" event={"ID":"31a54fa3-c02b-4d90-8375-b50ab8de60fe","Type":"ContainerDied","Data":"fcea060dd8edaf28d77724cb85e1b60a91cb53c09162f38fdc417a7e1376ad5c"} Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.174884 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.175225 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="ceilometer-central-agent" containerID="cri-o://349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f" gracePeriod=30 Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.175402 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="sg-core" containerID="cri-o://0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887" gracePeriod=30 Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.175469 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="ceilometer-notification-agent" containerID="cri-o://1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde" gracePeriod=30 Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.175718 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="proxy-httpd" 
containerID="cri-o://e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8" gracePeriod=30 Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.257739 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.257808 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.926407 4585 generic.go:334] "Generic (PLEG): container finished" podID="8c322955-ba56-4357-bc08-9828e570d4c8" containerID="e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8" exitCode=0 Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.926692 4585 generic.go:334] "Generic (PLEG): container finished" podID="8c322955-ba56-4357-bc08-9828e570d4c8" containerID="0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887" exitCode=2 Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.926480 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c322955-ba56-4357-bc08-9828e570d4c8","Type":"ContainerDied","Data":"e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8"} Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.926739 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c322955-ba56-4357-bc08-9828e570d4c8","Type":"ContainerDied","Data":"0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887"} Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.926750 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c322955-ba56-4357-bc08-9828e570d4c8","Type":"ContainerDied","Data":"349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f"} Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.926703 4585 generic.go:334] "Generic (PLEG): container finished" podID="8c322955-ba56-4357-bc08-9828e570d4c8" containerID="349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f" exitCode=0 Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.929279 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e6dc37b7-09de-4e17-9d88-358b3d3d5908","Type":"ContainerStarted","Data":"f118fd7b48acdeae276f6e024eed4411f424635ec7ea8f71b4b6fd6e59c194de"} Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.929338 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e6dc37b7-09de-4e17-9d88-358b3d3d5908","Type":"ContainerStarted","Data":"552c3a5387cfbaaa98dae88c6e0e29d243327ccd85335a1b74edeb7ba3c2c2d0"} Dec 01 14:17:45 crc kubenswrapper[4585]: I1201 14:17:45.956597 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.611015322 podStartE2EDuration="2.956575517s" podCreationTimestamp="2025-12-01 14:17:43 +0000 UTC" firstStartedPulling="2025-12-01 14:17:44.902206939 +0000 UTC m=+1178.886420794" lastFinishedPulling="2025-12-01 14:17:45.247767134 +0000 UTC m=+1179.231980989" observedRunningTime="2025-12-01 14:17:45.948355498 +0000 UTC m=+1179.932569353" watchObservedRunningTime="2025-12-01 14:17:45.956575517 +0000 UTC m=+1179.940789372" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.352663 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.361938 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.453782 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzc84\" (UniqueName: \"kubernetes.io/projected/31a54fa3-c02b-4d90-8375-b50ab8de60fe-kube-api-access-mzc84\") pod \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.453881 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-scripts\") pod \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.453950 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-config-data\") pod \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.454010 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-config-data\") pod \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.454087 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpg6g\" (UniqueName: \"kubernetes.io/projected/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-kube-api-access-gpg6g\") pod \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.454172 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-combined-ca-bundle\") pod \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.454216 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-combined-ca-bundle\") pod \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\" (UID: \"31a54fa3-c02b-4d90-8375-b50ab8de60fe\") " Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.454252 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-scripts\") pod \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\" (UID: \"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30\") " Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.459245 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-kube-api-access-gpg6g" (OuterVolumeSpecName: "kube-api-access-gpg6g") pod "fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30" (UID: "fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30"). InnerVolumeSpecName "kube-api-access-gpg6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.459668 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-scripts" (OuterVolumeSpecName: "scripts") pod "31a54fa3-c02b-4d90-8375-b50ab8de60fe" (UID: "31a54fa3-c02b-4d90-8375-b50ab8de60fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.461025 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-scripts" (OuterVolumeSpecName: "scripts") pod "fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30" (UID: "fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.475204 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a54fa3-c02b-4d90-8375-b50ab8de60fe-kube-api-access-mzc84" (OuterVolumeSpecName: "kube-api-access-mzc84") pod "31a54fa3-c02b-4d90-8375-b50ab8de60fe" (UID: "31a54fa3-c02b-4d90-8375-b50ab8de60fe"). InnerVolumeSpecName "kube-api-access-mzc84". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.489925 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-config-data" (OuterVolumeSpecName: "config-data") pod "31a54fa3-c02b-4d90-8375-b50ab8de60fe" (UID: "31a54fa3-c02b-4d90-8375-b50ab8de60fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.491079 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-config-data" (OuterVolumeSpecName: "config-data") pod "fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30" (UID: "fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.498389 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30" (UID: "fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.500474 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31a54fa3-c02b-4d90-8375-b50ab8de60fe" (UID: "31a54fa3-c02b-4d90-8375-b50ab8de60fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.558162 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpg6g\" (UniqueName: \"kubernetes.io/projected/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-kube-api-access-gpg6g\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.558192 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.558204 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.558214 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.558225 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzc84\" (UniqueName: \"kubernetes.io/projected/31a54fa3-c02b-4d90-8375-b50ab8de60fe-kube-api-access-mzc84\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.558235 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.558244 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.558254 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a54fa3-c02b-4d90-8375-b50ab8de60fe-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.939953 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b7vt8" event={"ID":"31a54fa3-c02b-4d90-8375-b50ab8de60fe","Type":"ContainerDied","Data":"91242f9d30aaa707bcf8da0d1dab507d5b36607e7ba0964ef1cda4fe99e13bd9"} Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.940207 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91242f9d30aaa707bcf8da0d1dab507d5b36607e7ba0964ef1cda4fe99e13bd9" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.940148 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b7vt8" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.941906 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vfxqx" event={"ID":"fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30","Type":"ContainerDied","Data":"b6385e120fa0d9c95ea71521ee4be688b0cfdc26b9095008f55c0e57eaa65d67"} Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.941950 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6385e120fa0d9c95ea71521ee4be688b0cfdc26b9095008f55c0e57eaa65d67" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.942006 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vfxqx" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.962702 4585 generic.go:334] "Generic (PLEG): container finished" podID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerID="b777eaee6c922789f9a336f3ca54a96b3c11e519ca4993ef5e19721ca1c35cf5" exitCode=137 Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.962941 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5b64975d-2mfhq" event={"ID":"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288","Type":"ContainerDied","Data":"b777eaee6c922789f9a336f3ca54a96b3c11e519ca4993ef5e19721ca1c35cf5"} Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.962987 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f5b64975d-2mfhq" event={"ID":"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288","Type":"ContainerDied","Data":"88b369e3178655e086e9a0801f919b9233e05bd6833be87fe93a8dc38fae50b6"} Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.962999 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88b369e3178655e086e9a0801f919b9233e05bd6833be87fe93a8dc38fae50b6" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.963021 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 14:17:46 crc kubenswrapper[4585]: I1201 14:17:46.984841 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.052188 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 14:17:47 crc kubenswrapper[4585]: E1201 14:17:47.053155 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a54fa3-c02b-4d90-8375-b50ab8de60fe" containerName="nova-manage" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.057385 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a54fa3-c02b-4d90-8375-b50ab8de60fe" containerName="nova-manage" Dec 01 14:17:47 crc kubenswrapper[4585]: E1201 14:17:47.057489 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.057549 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" Dec 01 14:17:47 crc kubenswrapper[4585]: E1201 14:17:47.057604 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30" containerName="nova-cell1-conductor-db-sync" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.057668 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30" containerName="nova-cell1-conductor-db-sync" Dec 01 14:17:47 crc kubenswrapper[4585]: E1201 14:17:47.057761 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon-log" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.057824 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon-log" Dec 01 14:17:47 crc kubenswrapper[4585]: E1201 14:17:47.057981 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.058041 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.058386 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon-log" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.058471 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30" containerName="nova-cell1-conductor-db-sync" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.058530 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.058587 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" containerName="horizon" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.058662 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a54fa3-c02b-4d90-8375-b50ab8de60fe" containerName="nova-manage" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.059384 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.073055 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.075230 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.075735 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-combined-ca-bundle\") pod \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.075839 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmxvr\" (UniqueName: \"kubernetes.io/projected/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-kube-api-access-gmxvr\") pod \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.075875 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-horizon-tls-certs\") pod \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.075916 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-scripts\") pod \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.075981 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-config-data\") pod \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.076016 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-horizon-secret-key\") pod \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.076067 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-logs\") pod \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\" (UID: \"d6fd44d5-e430-42bb-ad0b-7c78e7a1f288\") " Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.078921 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-logs" (OuterVolumeSpecName: "logs") pod "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" (UID: "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.082323 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" (UID: "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.089209 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-kube-api-access-gmxvr" (OuterVolumeSpecName: "kube-api-access-gmxvr") pod "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" (UID: "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288"). InnerVolumeSpecName "kube-api-access-gmxvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.119945 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-config-data" (OuterVolumeSpecName: "config-data") pod "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" (UID: "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.129089 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-scripts" (OuterVolumeSpecName: "scripts") pod "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" (UID: "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.171920 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" (UID: "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.172018 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" (UID: "d6fd44d5-e430-42bb-ad0b-7c78e7a1f288"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.178347 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgfhv\" (UniqueName: \"kubernetes.io/projected/4cf9cb62-4c4f-43ae-8a94-78eafbb19a82-kube-api-access-mgfhv\") pod \"nova-cell1-conductor-0\" (UID: \"4cf9cb62-4c4f-43ae-8a94-78eafbb19a82\") " pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.178398 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf9cb62-4c4f-43ae-8a94-78eafbb19a82-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4cf9cb62-4c4f-43ae-8a94-78eafbb19a82\") " pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.178433 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf9cb62-4c4f-43ae-8a94-78eafbb19a82-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4cf9cb62-4c4f-43ae-8a94-78eafbb19a82\") " pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.178621 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.178639 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.178649 4585 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.178659 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.178668 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.178677 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmxvr\" (UniqueName: \"kubernetes.io/projected/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-kube-api-access-gmxvr\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.178685 4585 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.227672 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.228263 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" containerName="nova-api-api" 
containerID="cri-o://cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518" gracePeriod=30 Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.227939 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" containerName="nova-api-log" containerID="cri-o://322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd" gracePeriod=30 Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.252110 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.252364 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0c214024-05df-4ee0-a121-04adc2e5f5c5" containerName="nova-scheduler-scheduler" containerID="cri-o://8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7" gracePeriod=30 Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.266437 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.266842 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0b1b5aa5-9df7-422b-96a9-f5f381caf344" containerName="nova-metadata-log" containerID="cri-o://67cfb6fa85a26b00c8e0ea9c2dedcad370a340f13fd73bae008662cda3e46254" gracePeriod=30 Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.266954 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0b1b5aa5-9df7-422b-96a9-f5f381caf344" containerName="nova-metadata-metadata" containerID="cri-o://2cb5e933ddfa0be02436199b66a85f5dade5e20a4bed7bd3bf3f35e4856d32b3" gracePeriod=30 Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.286121 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgfhv\" (UniqueName: \"kubernetes.io/projected/4cf9cb62-4c4f-43ae-8a94-78eafbb19a82-kube-api-access-mgfhv\") pod \"nova-cell1-conductor-0\" (UID: \"4cf9cb62-4c4f-43ae-8a94-78eafbb19a82\") " pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.286170 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf9cb62-4c4f-43ae-8a94-78eafbb19a82-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4cf9cb62-4c4f-43ae-8a94-78eafbb19a82\") " pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.286203 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf9cb62-4c4f-43ae-8a94-78eafbb19a82-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4cf9cb62-4c4f-43ae-8a94-78eafbb19a82\") " pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.292720 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cf9cb62-4c4f-43ae-8a94-78eafbb19a82-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4cf9cb62-4c4f-43ae-8a94-78eafbb19a82\") " pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.294670 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4cf9cb62-4c4f-43ae-8a94-78eafbb19a82-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4cf9cb62-4c4f-43ae-8a94-78eafbb19a82\") " pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.313880 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgfhv\" (UniqueName: \"kubernetes.io/projected/4cf9cb62-4c4f-43ae-8a94-78eafbb19a82-kube-api-access-mgfhv\") pod \"nova-cell1-conductor-0\" (UID: \"4cf9cb62-4c4f-43ae-8a94-78eafbb19a82\") " pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.409090 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:47 crc kubenswrapper[4585]: I1201 14:17:47.982798 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.000622 4585 generic.go:334] "Generic (PLEG): container finished" podID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" containerID="322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd" exitCode=143 Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.000698 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ad4421f3-1cd6-47fe-a017-8b7ea120652c","Type":"ContainerDied","Data":"322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd"} Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.028282 4585 generic.go:334] "Generic (PLEG): container finished" podID="0b1b5aa5-9df7-422b-96a9-f5f381caf344" containerID="2cb5e933ddfa0be02436199b66a85f5dade5e20a4bed7bd3bf3f35e4856d32b3" exitCode=0 Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.028320 4585 generic.go:334] "Generic (PLEG): container finished" podID="0b1b5aa5-9df7-422b-96a9-f5f381caf344" containerID="67cfb6fa85a26b00c8e0ea9c2dedcad370a340f13fd73bae008662cda3e46254" exitCode=143 Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.028420 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f5b64975d-2mfhq" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.029040 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b1b5aa5-9df7-422b-96a9-f5f381caf344","Type":"ContainerDied","Data":"2cb5e933ddfa0be02436199b66a85f5dade5e20a4bed7bd3bf3f35e4856d32b3"} Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.029108 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b1b5aa5-9df7-422b-96a9-f5f381caf344","Type":"ContainerDied","Data":"67cfb6fa85a26b00c8e0ea9c2dedcad370a340f13fd73bae008662cda3e46254"} Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.029118 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0b1b5aa5-9df7-422b-96a9-f5f381caf344","Type":"ContainerDied","Data":"4eb0055fd0e47949e48d071daced06d069354207b06a23add545bf51a63de582"} Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.029128 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eb0055fd0e47949e48d071daced06d069354207b06a23add545bf51a63de582" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.073798 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.118028 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f5b64975d-2mfhq"] Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.147034 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f5b64975d-2mfhq"] Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.206585 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-combined-ca-bundle\") pod \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.207299 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-config-data\") pod \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.207478 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b1b5aa5-9df7-422b-96a9-f5f381caf344-logs\") pod \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.207513 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6mpf\" (UniqueName: \"kubernetes.io/projected/0b1b5aa5-9df7-422b-96a9-f5f381caf344-kube-api-access-q6mpf\") pod \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.207532 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-nova-metadata-tls-certs\") pod \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\" (UID: \"0b1b5aa5-9df7-422b-96a9-f5f381caf344\") " Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.212301 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b1b5aa5-9df7-422b-96a9-f5f381caf344-logs" (OuterVolumeSpecName: "logs") pod "0b1b5aa5-9df7-422b-96a9-f5f381caf344" (UID: "0b1b5aa5-9df7-422b-96a9-f5f381caf344"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.228221 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b1b5aa5-9df7-422b-96a9-f5f381caf344-kube-api-access-q6mpf" (OuterVolumeSpecName: "kube-api-access-q6mpf") pod "0b1b5aa5-9df7-422b-96a9-f5f381caf344" (UID: "0b1b5aa5-9df7-422b-96a9-f5f381caf344"). InnerVolumeSpecName "kube-api-access-q6mpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.297677 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-config-data" (OuterVolumeSpecName: "config-data") pod "0b1b5aa5-9df7-422b-96a9-f5f381caf344" (UID: "0b1b5aa5-9df7-422b-96a9-f5f381caf344"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.304111 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b1b5aa5-9df7-422b-96a9-f5f381caf344" (UID: "0b1b5aa5-9df7-422b-96a9-f5f381caf344"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.309269 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.309303 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b1b5aa5-9df7-422b-96a9-f5f381caf344-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.309316 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6mpf\" (UniqueName: \"kubernetes.io/projected/0b1b5aa5-9df7-422b-96a9-f5f381caf344-kube-api-access-q6mpf\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.309327 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.315263 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0b1b5aa5-9df7-422b-96a9-f5f381caf344" (UID: "0b1b5aa5-9df7-422b-96a9-f5f381caf344"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.410724 4585 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b1b5aa5-9df7-422b-96a9-f5f381caf344-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:48 crc kubenswrapper[4585]: I1201 14:17:48.421562 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6fd44d5-e430-42bb-ad0b-7c78e7a1f288" path="/var/lib/kubelet/pods/d6fd44d5-e430-42bb-ad0b-7c78e7a1f288/volumes" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.047053 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.049192 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4cf9cb62-4c4f-43ae-8a94-78eafbb19a82","Type":"ContainerStarted","Data":"d46d8c66faefcdbf7bede8979f5cac82ceb431fb10a877102e471b8c53e854ab"} Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.049241 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4cf9cb62-4c4f-43ae-8a94-78eafbb19a82","Type":"ContainerStarted","Data":"c66a501642651f1588ab8e6bbe66291e026f47ea4d891f4cc5027a3b48a348a8"} Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.049294 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.069587 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.069571374 podStartE2EDuration="2.069571374s" podCreationTimestamp="2025-12-01 14:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:17:49.064316014 +0000 UTC m=+1183.048529869" watchObservedRunningTime="2025-12-01 14:17:49.069571374 +0000 UTC m=+1183.053785229" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.086999 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.097120 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.118298 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:49 crc kubenswrapper[4585]: E1201 14:17:49.118847 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1b5aa5-9df7-422b-96a9-f5f381caf344" containerName="nova-metadata-metadata" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.118865 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1b5aa5-9df7-422b-96a9-f5f381caf344" containerName="nova-metadata-metadata" Dec 01 14:17:49 crc kubenswrapper[4585]: E1201 14:17:49.118881 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1b5aa5-9df7-422b-96a9-f5f381caf344" containerName="nova-metadata-log" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.118889 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1b5aa5-9df7-422b-96a9-f5f381caf344" containerName="nova-metadata-log" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.119319 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b1b5aa5-9df7-422b-96a9-f5f381caf344" containerName="nova-metadata-log" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.119338 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b1b5aa5-9df7-422b-96a9-f5f381caf344" containerName="nova-metadata-metadata" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.120674 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.125429 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.125634 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.128216 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.226006 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.226053 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-config-data\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.226075 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtmlx\" (UniqueName: \"kubernetes.io/projected/d9080235-3e98-4f0f-945c-475e6de22a49-kube-api-access-gtmlx\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.226095 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9080235-3e98-4f0f-945c-475e6de22a49-logs\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.226140 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.328217 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.328271 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-config-data\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.328293 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtmlx\" (UniqueName: \"kubernetes.io/projected/d9080235-3e98-4f0f-945c-475e6de22a49-kube-api-access-gtmlx\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " 
pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.328318 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9080235-3e98-4f0f-945c-475e6de22a49-logs\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.328337 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.328912 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9080235-3e98-4f0f-945c-475e6de22a49-logs\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.333281 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-config-data\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.334644 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.345005 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.345330 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtmlx\" (UniqueName: \"kubernetes.io/projected/d9080235-3e98-4f0f-945c-475e6de22a49-kube-api-access-gtmlx\") pod \"nova-metadata-0\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.453198 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:17:49 crc kubenswrapper[4585]: I1201 14:17:49.905376 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:17:50 crc kubenswrapper[4585]: I1201 14:17:50.057203 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9080235-3e98-4f0f-945c-475e6de22a49","Type":"ContainerStarted","Data":"69146a789b11c65e32d09683cec09d36396e40c502c30b9d3771ccc6903a995f"} Dec 01 14:17:50 crc kubenswrapper[4585]: I1201 14:17:50.423579 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b1b5aa5-9df7-422b-96a9-f5f381caf344" path="/var/lib/kubelet/pods/0b1b5aa5-9df7-422b-96a9-f5f381caf344/volumes" Dec 01 14:17:50 crc kubenswrapper[4585]: I1201 14:17:50.970486 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:17:50 crc kubenswrapper[4585]: I1201 14:17:50.993348 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.060477 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjhvt\" (UniqueName: \"kubernetes.io/projected/ad4421f3-1cd6-47fe-a017-8b7ea120652c-kube-api-access-fjhvt\") pod \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.060566 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad4421f3-1cd6-47fe-a017-8b7ea120652c-combined-ca-bundle\") pod \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.060598 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad4421f3-1cd6-47fe-a017-8b7ea120652c-logs\") pod \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.060647 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c214024-05df-4ee0-a121-04adc2e5f5c5-config-data\") pod \"0c214024-05df-4ee0-a121-04adc2e5f5c5\" (UID: \"0c214024-05df-4ee0-a121-04adc2e5f5c5\") " Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.060718 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4421f3-1cd6-47fe-a017-8b7ea120652c-config-data\") pod \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\" (UID: \"ad4421f3-1cd6-47fe-a017-8b7ea120652c\") " Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.060738 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvsxs\" (UniqueName: \"kubernetes.io/projected/0c214024-05df-4ee0-a121-04adc2e5f5c5-kube-api-access-lvsxs\") pod \"0c214024-05df-4ee0-a121-04adc2e5f5c5\" (UID: \"0c214024-05df-4ee0-a121-04adc2e5f5c5\") " Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.060766 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c214024-05df-4ee0-a121-04adc2e5f5c5-combined-ca-bundle\") pod \"0c214024-05df-4ee0-a121-04adc2e5f5c5\" (UID: \"0c214024-05df-4ee0-a121-04adc2e5f5c5\") " Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.061731 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad4421f3-1cd6-47fe-a017-8b7ea120652c-logs" (OuterVolumeSpecName: "logs") pod "ad4421f3-1cd6-47fe-a017-8b7ea120652c" (UID: "ad4421f3-1cd6-47fe-a017-8b7ea120652c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.072265 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c214024-05df-4ee0-a121-04adc2e5f5c5-kube-api-access-lvsxs" (OuterVolumeSpecName: "kube-api-access-lvsxs") pod "0c214024-05df-4ee0-a121-04adc2e5f5c5" (UID: "0c214024-05df-4ee0-a121-04adc2e5f5c5"). InnerVolumeSpecName "kube-api-access-lvsxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.072801 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4421f3-1cd6-47fe-a017-8b7ea120652c-kube-api-access-fjhvt" (OuterVolumeSpecName: "kube-api-access-fjhvt") pod "ad4421f3-1cd6-47fe-a017-8b7ea120652c" (UID: "ad4421f3-1cd6-47fe-a017-8b7ea120652c"). InnerVolumeSpecName "kube-api-access-fjhvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.077571 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9080235-3e98-4f0f-945c-475e6de22a49","Type":"ContainerStarted","Data":"199cdb1bedef6b3ef896b88c40c0ae60097e550577cb48b6a0b29dc388671876"} Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.077622 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9080235-3e98-4f0f-945c-475e6de22a49","Type":"ContainerStarted","Data":"798fc74143f881628ca03f67176da27c5f13d751e6b0423d7a360227f11e8a68"} Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.085632 4585 generic.go:334] "Generic (PLEG): container finished" podID="0c214024-05df-4ee0-a121-04adc2e5f5c5" containerID="8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7" exitCode=0 Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.085695 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c214024-05df-4ee0-a121-04adc2e5f5c5","Type":"ContainerDied","Data":"8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7"} Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.085721 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c214024-05df-4ee0-a121-04adc2e5f5c5","Type":"ContainerDied","Data":"f00ab36b814b7361723f31bd40dd53dae8865a5b77e87e15813476fe60585576"} Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.085738 4585 scope.go:117] "RemoveContainer" containerID="8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.085864 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.089040 4585 generic.go:334] "Generic (PLEG): container finished" podID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" containerID="cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518" exitCode=0 Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.089269 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.089815 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ad4421f3-1cd6-47fe-a017-8b7ea120652c","Type":"ContainerDied","Data":"cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518"} Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.089899 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ad4421f3-1cd6-47fe-a017-8b7ea120652c","Type":"ContainerDied","Data":"c81dfcf7b5033dd0383669dca9466baeda977fd5424e13a96da8c21a0ec5570a"} Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.115189 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad4421f3-1cd6-47fe-a017-8b7ea120652c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad4421f3-1cd6-47fe-a017-8b7ea120652c" (UID: "ad4421f3-1cd6-47fe-a017-8b7ea120652c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.119088 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.11907177 podStartE2EDuration="2.11907177s" podCreationTimestamp="2025-12-01 14:17:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:17:51.110248794 +0000 UTC m=+1185.094462649" watchObservedRunningTime="2025-12-01 14:17:51.11907177 +0000 UTC m=+1185.103285625" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.122487 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c214024-05df-4ee0-a121-04adc2e5f5c5-config-data" (OuterVolumeSpecName: "config-data") pod "0c214024-05df-4ee0-a121-04adc2e5f5c5" (UID: "0c214024-05df-4ee0-a121-04adc2e5f5c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.131314 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c214024-05df-4ee0-a121-04adc2e5f5c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c214024-05df-4ee0-a121-04adc2e5f5c5" (UID: "0c214024-05df-4ee0-a121-04adc2e5f5c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.141362 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad4421f3-1cd6-47fe-a017-8b7ea120652c-config-data" (OuterVolumeSpecName: "config-data") pod "ad4421f3-1cd6-47fe-a017-8b7ea120652c" (UID: "ad4421f3-1cd6-47fe-a017-8b7ea120652c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.148819 4585 scope.go:117] "RemoveContainer" containerID="8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7" Dec 01 14:17:51 crc kubenswrapper[4585]: E1201 14:17:51.151069 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7\": container with ID starting with 8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7 not found: ID does not exist" containerID="8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.151117 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7"} err="failed to get container status \"8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7\": rpc error: code = NotFound desc = could not find container \"8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7\": container with ID starting with 8b392d66828f6fd499c67724d985bd5d31084bdffa01eb7dc7c6917836ab59e7 not found: ID does not exist" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.151143 4585 scope.go:117] "RemoveContainer" containerID="cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.163300 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjhvt\" (UniqueName: \"kubernetes.io/projected/ad4421f3-1cd6-47fe-a017-8b7ea120652c-kube-api-access-fjhvt\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.163333 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad4421f3-1cd6-47fe-a017-8b7ea120652c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.163343 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad4421f3-1cd6-47fe-a017-8b7ea120652c-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.163353 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c214024-05df-4ee0-a121-04adc2e5f5c5-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.163362 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad4421f3-1cd6-47fe-a017-8b7ea120652c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.163371 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvsxs\" (UniqueName: \"kubernetes.io/projected/0c214024-05df-4ee0-a121-04adc2e5f5c5-kube-api-access-lvsxs\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.163379 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c214024-05df-4ee0-a121-04adc2e5f5c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.185362 4585 scope.go:117] "RemoveContainer" containerID="322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd" Dec 01 14:17:51 crc 
kubenswrapper[4585]: I1201 14:17:51.206015 4585 scope.go:117] "RemoveContainer" containerID="cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518" Dec 01 14:17:51 crc kubenswrapper[4585]: E1201 14:17:51.206638 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518\": container with ID starting with cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518 not found: ID does not exist" containerID="cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.206680 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518"} err="failed to get container status \"cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518\": rpc error: code = NotFound desc = could not find container \"cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518\": container with ID starting with cdf934349ebe5f6190cb6e45580f731d3a4b8f7f82ba8a9a44c9233521813518 not found: ID does not exist" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.206708 4585 scope.go:117] "RemoveContainer" containerID="322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd" Dec 01 14:17:51 crc kubenswrapper[4585]: E1201 14:17:51.207030 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd\": container with ID starting with 322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd not found: ID does not exist" containerID="322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.207078 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd"} err="failed to get container status \"322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd\": rpc error: code = NotFound desc = could not find container \"322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd\": container with ID starting with 322e5d29a50b16bd223090cc1a749e920cd268ce8cc3c43ddf38009966e082bd not found: ID does not exist" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.421710 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.435105 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.450535 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.466083 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.480334 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:17:51 crc kubenswrapper[4585]: E1201 14:17:51.480845 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" containerName="nova-api-api" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.480868 4585 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" containerName="nova-api-api" Dec 01 14:17:51 crc kubenswrapper[4585]: E1201 14:17:51.480883 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c214024-05df-4ee0-a121-04adc2e5f5c5" containerName="nova-scheduler-scheduler" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.480894 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c214024-05df-4ee0-a121-04adc2e5f5c5" containerName="nova-scheduler-scheduler" Dec 01 14:17:51 crc kubenswrapper[4585]: E1201 14:17:51.480914 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" containerName="nova-api-log" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.480922 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" containerName="nova-api-log" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.481147 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c214024-05df-4ee0-a121-04adc2e5f5c5" containerName="nova-scheduler-scheduler" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.481177 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" containerName="nova-api-api" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.481200 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" containerName="nova-api-log" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.481870 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.488072 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.490344 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.502750 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.504746 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.508846 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.522300 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.573418 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmvd5\" (UniqueName: \"kubernetes.io/projected/991dd5fb-9091-44df-a27a-9f130b663a51-kube-api-access-vmvd5\") pod \"nova-scheduler-0\" (UID: \"991dd5fb-9091-44df-a27a-9f130b663a51\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.573479 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/991dd5fb-9091-44df-a27a-9f130b663a51-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"991dd5fb-9091-44df-a27a-9f130b663a51\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.573650 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aecb332-b72b-4862-8416-3e657de2aefd-logs\") pod \"nova-api-0\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.573686 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/991dd5fb-9091-44df-a27a-9f130b663a51-config-data\") pod \"nova-scheduler-0\" (UID: \"991dd5fb-9091-44df-a27a-9f130b663a51\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.573727 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aecb332-b72b-4862-8416-3e657de2aefd-config-data\") pod \"nova-api-0\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.573813 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp4xw\" (UniqueName: \"kubernetes.io/projected/8aecb332-b72b-4862-8416-3e657de2aefd-kube-api-access-lp4xw\") pod \"nova-api-0\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.573838 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aecb332-b72b-4862-8416-3e657de2aefd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.675398 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmvd5\" (UniqueName: \"kubernetes.io/projected/991dd5fb-9091-44df-a27a-9f130b663a51-kube-api-access-vmvd5\") pod \"nova-scheduler-0\" (UID: \"991dd5fb-9091-44df-a27a-9f130b663a51\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.675731 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/991dd5fb-9091-44df-a27a-9f130b663a51-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"991dd5fb-9091-44df-a27a-9f130b663a51\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.676468 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aecb332-b72b-4862-8416-3e657de2aefd-logs\") pod \"nova-api-0\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.676567 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/991dd5fb-9091-44df-a27a-9f130b663a51-config-data\") pod \"nova-scheduler-0\" (UID: \"991dd5fb-9091-44df-a27a-9f130b663a51\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.676680 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aecb332-b72b-4862-8416-3e657de2aefd-config-data\") pod \"nova-api-0\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.676791 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp4xw\" (UniqueName: \"kubernetes.io/projected/8aecb332-b72b-4862-8416-3e657de2aefd-kube-api-access-lp4xw\") pod \"nova-api-0\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.676861 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aecb332-b72b-4862-8416-3e657de2aefd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.676904 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aecb332-b72b-4862-8416-3e657de2aefd-logs\") pod \"nova-api-0\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.680575 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aecb332-b72b-4862-8416-3e657de2aefd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.682413 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/991dd5fb-9091-44df-a27a-9f130b663a51-config-data\") pod \"nova-scheduler-0\" (UID: \"991dd5fb-9091-44df-a27a-9f130b663a51\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.682914 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aecb332-b72b-4862-8416-3e657de2aefd-config-data\") pod \"nova-api-0\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.690021 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/991dd5fb-9091-44df-a27a-9f130b663a51-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"991dd5fb-9091-44df-a27a-9f130b663a51\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.693926 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmvd5\" (UniqueName: \"kubernetes.io/projected/991dd5fb-9091-44df-a27a-9f130b663a51-kube-api-access-vmvd5\") pod \"nova-scheduler-0\" (UID: \"991dd5fb-9091-44df-a27a-9f130b663a51\") " pod="openstack/nova-scheduler-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.694199 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp4xw\" (UniqueName: \"kubernetes.io/projected/8aecb332-b72b-4862-8416-3e657de2aefd-kube-api-access-lp4xw\") pod \"nova-api-0\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " pod="openstack/nova-api-0" Dec 01 14:17:51 crc kubenswrapper[4585]: I1201 14:17:51.807496 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 14:17:52 crc kubenswrapper[4585]: I1201 14:17:51.889344 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:17:52 crc kubenswrapper[4585]: I1201 14:17:52.425102 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c214024-05df-4ee0-a121-04adc2e5f5c5" path="/var/lib/kubelet/pods/0c214024-05df-4ee0-a121-04adc2e5f5c5/volumes" Dec 01 14:17:52 crc kubenswrapper[4585]: I1201 14:17:52.428018 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad4421f3-1cd6-47fe-a017-8b7ea120652c" path="/var/lib/kubelet/pods/ad4421f3-1cd6-47fe-a017-8b7ea120652c/volumes" Dec 01 14:17:52 crc kubenswrapper[4585]: I1201 14:17:52.433507 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:17:52 crc kubenswrapper[4585]: I1201 14:17:52.442594 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:17:53 crc kubenswrapper[4585]: I1201 14:17:53.114507 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aecb332-b72b-4862-8416-3e657de2aefd","Type":"ContainerStarted","Data":"e77ac58462a4b79108075f8a16c90c07b74d4a30a529d6f0b7e5d6d14b7b357a"} Dec 01 14:17:53 crc kubenswrapper[4585]: I1201 14:17:53.114779 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aecb332-b72b-4862-8416-3e657de2aefd","Type":"ContainerStarted","Data":"3c7093cd7173bbcba9cabf4360fad48fc3e1868aaef456aa42e47b0b8effce35"} Dec 01 14:17:53 crc kubenswrapper[4585]: I1201 14:17:53.114790 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aecb332-b72b-4862-8416-3e657de2aefd","Type":"ContainerStarted","Data":"2a07bfcf03a191231b2b80d236993efd8c37b5df60737f3cbde1067e69b4ba5d"} Dec 01 14:17:53 crc kubenswrapper[4585]: I1201 14:17:53.117868 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"991dd5fb-9091-44df-a27a-9f130b663a51","Type":"ContainerStarted","Data":"da1082cd1c1f62526de89333e81b0e2a2c3f5a276706131e9ce35f4059abf6e0"} Dec 01 14:17:53 crc kubenswrapper[4585]: I1201 14:17:53.117916 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"991dd5fb-9091-44df-a27a-9f130b663a51","Type":"ContainerStarted","Data":"0e4f554130f94fbfb3a664a495b63147e23c556ba2760b4e375c1eb98e15a5bb"} 
Dec 01 14:17:53 crc kubenswrapper[4585]: I1201 14:17:53.140710 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.140692161 podStartE2EDuration="2.140692161s" podCreationTimestamp="2025-12-01 14:17:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:17:53.139073128 +0000 UTC m=+1187.123286993" watchObservedRunningTime="2025-12-01 14:17:53.140692161 +0000 UTC m=+1187.124906016" Dec 01 14:17:53 crc kubenswrapper[4585]: I1201 14:17:53.164071 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.164050914 podStartE2EDuration="2.164050914s" podCreationTimestamp="2025-12-01 14:17:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:17:53.159431991 +0000 UTC m=+1187.143645866" watchObservedRunningTime="2025-12-01 14:17:53.164050914 +0000 UTC m=+1187.148264769" Dec 01 14:17:54 crc kubenswrapper[4585]: I1201 14:17:54.387636 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 14:17:54 crc kubenswrapper[4585]: I1201 14:17:54.454080 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 14:17:54 crc kubenswrapper[4585]: I1201 14:17:54.454462 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.133218 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.140079 4585 generic.go:334] "Generic (PLEG): container finished" podID="8c322955-ba56-4357-bc08-9828e570d4c8" containerID="1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde" exitCode=0 Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.140162 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.140220 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c322955-ba56-4357-bc08-9828e570d4c8","Type":"ContainerDied","Data":"1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde"} Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.140270 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c322955-ba56-4357-bc08-9828e570d4c8","Type":"ContainerDied","Data":"a2da8c133db69f02f05b57a299d76c494c949338982f490843070fc82c66a6e8"} Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.140292 4585 scope.go:117] "RemoveContainer" containerID="e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.171146 4585 scope.go:117] "RemoveContainer" containerID="0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.223703 4585 scope.go:117] "RemoveContainer" containerID="1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.250253 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c322955-ba56-4357-bc08-9828e570d4c8-run-httpd\") pod \"8c322955-ba56-4357-bc08-9828e570d4c8\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.250288 4585 scope.go:117] "RemoveContainer" containerID="349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.250558 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-sg-core-conf-yaml\") pod \"8c322955-ba56-4357-bc08-9828e570d4c8\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.250595 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c322955-ba56-4357-bc08-9828e570d4c8-log-httpd\") pod \"8c322955-ba56-4357-bc08-9828e570d4c8\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.250635 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-combined-ca-bundle\") pod \"8c322955-ba56-4357-bc08-9828e570d4c8\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.250651 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-config-data\") pod \"8c322955-ba56-4357-bc08-9828e570d4c8\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.250674 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mjg2\" (UniqueName: \"kubernetes.io/projected/8c322955-ba56-4357-bc08-9828e570d4c8-kube-api-access-6mjg2\") pod \"8c322955-ba56-4357-bc08-9828e570d4c8\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.250749 4585 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-scripts\") pod \"8c322955-ba56-4357-bc08-9828e570d4c8\" (UID: \"8c322955-ba56-4357-bc08-9828e570d4c8\") " Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.250825 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c322955-ba56-4357-bc08-9828e570d4c8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8c322955-ba56-4357-bc08-9828e570d4c8" (UID: "8c322955-ba56-4357-bc08-9828e570d4c8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.251237 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c322955-ba56-4357-bc08-9828e570d4c8-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.252317 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c322955-ba56-4357-bc08-9828e570d4c8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8c322955-ba56-4357-bc08-9828e570d4c8" (UID: "8c322955-ba56-4357-bc08-9828e570d4c8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.258102 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-scripts" (OuterVolumeSpecName: "scripts") pod "8c322955-ba56-4357-bc08-9828e570d4c8" (UID: "8c322955-ba56-4357-bc08-9828e570d4c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.259141 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c322955-ba56-4357-bc08-9828e570d4c8-kube-api-access-6mjg2" (OuterVolumeSpecName: "kube-api-access-6mjg2") pod "8c322955-ba56-4357-bc08-9828e570d4c8" (UID: "8c322955-ba56-4357-bc08-9828e570d4c8"). InnerVolumeSpecName "kube-api-access-6mjg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.281065 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8c322955-ba56-4357-bc08-9828e570d4c8" (UID: "8c322955-ba56-4357-bc08-9828e570d4c8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.330268 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c322955-ba56-4357-bc08-9828e570d4c8" (UID: "8c322955-ba56-4357-bc08-9828e570d4c8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.353107 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.353508 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.353519 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c322955-ba56-4357-bc08-9828e570d4c8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.353529 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.353538 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mjg2\" (UniqueName: \"kubernetes.io/projected/8c322955-ba56-4357-bc08-9828e570d4c8-kube-api-access-6mjg2\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.357130 4585 scope.go:117] "RemoveContainer" containerID="e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8" Dec 01 14:17:55 crc kubenswrapper[4585]: E1201 14:17:55.357485 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8\": container with ID starting with e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8 not found: ID does not exist" containerID="e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.357529 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8"} err="failed to get container status \"e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8\": rpc error: code = NotFound desc = could not find container \"e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8\": container with ID starting with e89015dbc6948547868eba5afb2622cd8a3efca48252527b738f709b7f9166f8 not found: ID does not exist" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.357555 4585 scope.go:117] "RemoveContainer" containerID="0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887" Dec 01 14:17:55 crc kubenswrapper[4585]: E1201 14:17:55.357834 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887\": container with ID starting with 0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887 not found: ID does not exist" containerID="0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.357865 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887"} err="failed to get container status 
\"0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887\": rpc error: code = NotFound desc = could not find container \"0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887\": container with ID starting with 0850bc1c40a8a1d86ac60cfedd96fe073a01266bea49d51fa948fad812494887 not found: ID does not exist" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.357882 4585 scope.go:117] "RemoveContainer" containerID="1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde" Dec 01 14:17:55 crc kubenswrapper[4585]: E1201 14:17:55.358151 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde\": container with ID starting with 1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde not found: ID does not exist" containerID="1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.358179 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde"} err="failed to get container status \"1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde\": rpc error: code = NotFound desc = could not find container \"1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde\": container with ID starting with 1d216ea0f7b1de6b629daaaf85cfba679c69c16169c5fb9b4fb31b443e1d6dde not found: ID does not exist" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.358196 4585 scope.go:117] "RemoveContainer" containerID="349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f" Dec 01 14:17:55 crc kubenswrapper[4585]: E1201 14:17:55.358391 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f\": container with ID starting with 349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f not found: ID does not exist" containerID="349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.358423 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f"} err="failed to get container status \"349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f\": rpc error: code = NotFound desc = could not find container \"349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f\": container with ID starting with 349cee5c3d4760c56f3bb57a662be7c9f966bdbb9fc69f4ce1600cdc2e558a2f not found: ID does not exist" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.380003 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-config-data" (OuterVolumeSpecName: "config-data") pod "8c322955-ba56-4357-bc08-9828e570d4c8" (UID: "8c322955-ba56-4357-bc08-9828e570d4c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.455817 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c322955-ba56-4357-bc08-9828e570d4c8-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.473625 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.481085 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.494815 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:17:55 crc kubenswrapper[4585]: E1201 14:17:55.495252 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="sg-core" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.495271 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="sg-core" Dec 01 14:17:55 crc kubenswrapper[4585]: E1201 14:17:55.495293 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="proxy-httpd" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.495298 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="proxy-httpd" Dec 01 14:17:55 crc kubenswrapper[4585]: E1201 14:17:55.495311 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="ceilometer-notification-agent" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.495319 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="ceilometer-notification-agent" Dec 01 14:17:55 crc kubenswrapper[4585]: E1201 14:17:55.495342 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="ceilometer-central-agent" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.495349 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="ceilometer-central-agent" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.495522 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="ceilometer-notification-agent" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.495541 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="ceilometer-central-agent" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.495551 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="proxy-httpd" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.495563 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" containerName="sg-core" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.497554 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.499259 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.507660 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.507866 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.526005 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.557588 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.557666 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-config-data\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.557758 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-scripts\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.557835 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.557879 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598a75f1-7d24-40ff-818c-9cc023cf6b1d-log-httpd\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.557946 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598a75f1-7d24-40ff-818c-9cc023cf6b1d-run-httpd\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.557995 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.558065 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zf98\" (UniqueName: 
\"kubernetes.io/projected/598a75f1-7d24-40ff-818c-9cc023cf6b1d-kube-api-access-7zf98\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.659636 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.659711 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-config-data\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.659748 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-scripts\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.659794 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.659822 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598a75f1-7d24-40ff-818c-9cc023cf6b1d-log-httpd\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.659858 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598a75f1-7d24-40ff-818c-9cc023cf6b1d-run-httpd\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.659884 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.659925 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zf98\" (UniqueName: \"kubernetes.io/projected/598a75f1-7d24-40ff-818c-9cc023cf6b1d-kube-api-access-7zf98\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.660812 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598a75f1-7d24-40ff-818c-9cc023cf6b1d-log-httpd\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.660905 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/598a75f1-7d24-40ff-818c-9cc023cf6b1d-run-httpd\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.666764 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.667225 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-config-data\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.667355 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.668729 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.673932 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-scripts\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.675877 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zf98\" (UniqueName: \"kubernetes.io/projected/598a75f1-7d24-40ff-818c-9cc023cf6b1d-kube-api-access-7zf98\") pod \"ceilometer-0\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " pod="openstack/ceilometer-0" Dec 01 14:17:55 crc kubenswrapper[4585]: I1201 14:17:55.813355 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:17:56 crc kubenswrapper[4585]: I1201 14:17:56.261089 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:17:56 crc kubenswrapper[4585]: W1201 14:17:56.274316 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod598a75f1_7d24_40ff_818c_9cc023cf6b1d.slice/crio-1862be9a992370ed34411787960e22c27d7aed1db079a059e7a20cb8864e5dd0 WatchSource:0}: Error finding container 1862be9a992370ed34411787960e22c27d7aed1db079a059e7a20cb8864e5dd0: Status 404 returned error can't find the container with id 1862be9a992370ed34411787960e22c27d7aed1db079a059e7a20cb8864e5dd0 Dec 01 14:17:56 crc kubenswrapper[4585]: I1201 14:17:56.421485 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c322955-ba56-4357-bc08-9828e570d4c8" path="/var/lib/kubelet/pods/8c322955-ba56-4357-bc08-9828e570d4c8/volumes" Dec 01 14:17:56 crc kubenswrapper[4585]: I1201 14:17:56.807966 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 14:17:57 crc kubenswrapper[4585]: I1201 14:17:57.158787 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"598a75f1-7d24-40ff-818c-9cc023cf6b1d","Type":"ContainerStarted","Data":"1862be9a992370ed34411787960e22c27d7aed1db079a059e7a20cb8864e5dd0"} Dec 01 14:17:57 crc kubenswrapper[4585]: I1201 14:17:57.445864 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 01 14:17:58 crc kubenswrapper[4585]: I1201 14:17:58.169745 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"598a75f1-7d24-40ff-818c-9cc023cf6b1d","Type":"ContainerStarted","Data":"56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514"} Dec 01 14:17:59 crc kubenswrapper[4585]: I1201 14:17:59.180552 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"598a75f1-7d24-40ff-818c-9cc023cf6b1d","Type":"ContainerStarted","Data":"f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709"} Dec 01 14:17:59 crc kubenswrapper[4585]: I1201 14:17:59.453993 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 14:17:59 crc kubenswrapper[4585]: I1201 14:17:59.455546 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 14:18:00 crc kubenswrapper[4585]: I1201 14:18:00.190639 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"598a75f1-7d24-40ff-818c-9cc023cf6b1d","Type":"ContainerStarted","Data":"bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28"} Dec 01 14:18:00 crc kubenswrapper[4585]: I1201 14:18:00.464263 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 14:18:00 crc kubenswrapper[4585]: I1201 14:18:00.465509 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 14:18:01 crc kubenswrapper[4585]: I1201 14:18:01.808670 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 14:18:01 crc kubenswrapper[4585]: I1201 14:18:01.847702 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 14:18:01 crc kubenswrapper[4585]: I1201 14:18:01.892904 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 14:18:01 crc kubenswrapper[4585]: I1201 14:18:01.895200 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 14:18:02 crc kubenswrapper[4585]: I1201 14:18:02.243950 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 14:18:02 crc kubenswrapper[4585]: I1201 14:18:02.976230 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8aecb332-b72b-4862-8416-3e657de2aefd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 14:18:02 crc kubenswrapper[4585]: I1201 14:18:02.976250 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8aecb332-b72b-4862-8416-3e657de2aefd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 14:18:03 crc kubenswrapper[4585]: I1201 14:18:03.221000 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"598a75f1-7d24-40ff-818c-9cc023cf6b1d","Type":"ContainerStarted","Data":"b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7"} Dec 01 14:18:03 crc kubenswrapper[4585]: I1201 14:18:03.250194 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.680010903 podStartE2EDuration="8.250173648s" podCreationTimestamp="2025-12-01 14:17:55 +0000 UTC" firstStartedPulling="2025-12-01 14:17:56.27628479 +0000 UTC m=+1190.260498645" lastFinishedPulling="2025-12-01 14:18:01.846447535 +0000 UTC m=+1195.830661390" observedRunningTime="2025-12-01 14:18:03.23785766 +0000 UTC m=+1197.222071515" watchObservedRunningTime="2025-12-01 14:18:03.250173648 +0000 UTC m=+1197.234387493" Dec 01 14:18:04 crc kubenswrapper[4585]: I1201 14:18:04.230530 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.210322 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.278397 4585 generic.go:334] "Generic (PLEG): container finished" podID="b498816b-102e-4feb-bf5d-0666528a370d" containerID="e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751" exitCode=137 Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.278445 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b498816b-102e-4feb-bf5d-0666528a370d","Type":"ContainerDied","Data":"e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751"} Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.278448 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.278473 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b498816b-102e-4feb-bf5d-0666528a370d","Type":"ContainerDied","Data":"ff0a5cf12435ab507d74a93d86a901dda87ae28301dc967622d09cca206e9357"} Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.278491 4585 scope.go:117] "RemoveContainer" containerID="e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.301106 4585 scope.go:117] "RemoveContainer" containerID="e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751" Dec 01 14:18:09 crc kubenswrapper[4585]: E1201 14:18:09.301500 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751\": container with ID starting with e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751 not found: ID does not exist" containerID="e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.301541 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751"} err="failed to get container status \"e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751\": rpc error: code = NotFound desc = could not find container \"e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751\": container with ID starting with e42a0432904abcccbd590e9b876347e521aecda15e651758e5e493362a880751 not found: ID does not exist" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.331067 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg65b\" (UniqueName: \"kubernetes.io/projected/b498816b-102e-4feb-bf5d-0666528a370d-kube-api-access-fg65b\") pod \"b498816b-102e-4feb-bf5d-0666528a370d\" (UID: \"b498816b-102e-4feb-bf5d-0666528a370d\") " Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.331106 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b498816b-102e-4feb-bf5d-0666528a370d-config-data\") pod \"b498816b-102e-4feb-bf5d-0666528a370d\" (UID: \"b498816b-102e-4feb-bf5d-0666528a370d\") " Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.331339 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b498816b-102e-4feb-bf5d-0666528a370d-combined-ca-bundle\") pod 
\"b498816b-102e-4feb-bf5d-0666528a370d\" (UID: \"b498816b-102e-4feb-bf5d-0666528a370d\") " Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.336753 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b498816b-102e-4feb-bf5d-0666528a370d-kube-api-access-fg65b" (OuterVolumeSpecName: "kube-api-access-fg65b") pod "b498816b-102e-4feb-bf5d-0666528a370d" (UID: "b498816b-102e-4feb-bf5d-0666528a370d"). InnerVolumeSpecName "kube-api-access-fg65b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.358118 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b498816b-102e-4feb-bf5d-0666528a370d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b498816b-102e-4feb-bf5d-0666528a370d" (UID: "b498816b-102e-4feb-bf5d-0666528a370d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.362456 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b498816b-102e-4feb-bf5d-0666528a370d-config-data" (OuterVolumeSpecName: "config-data") pod "b498816b-102e-4feb-bf5d-0666528a370d" (UID: "b498816b-102e-4feb-bf5d-0666528a370d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.433861 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b498816b-102e-4feb-bf5d-0666528a370d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.433913 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg65b\" (UniqueName: \"kubernetes.io/projected/b498816b-102e-4feb-bf5d-0666528a370d-kube-api-access-fg65b\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.433935 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b498816b-102e-4feb-bf5d-0666528a370d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.460088 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.461451 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.468726 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.613837 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.630339 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.642183 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 14:18:09 crc kubenswrapper[4585]: E1201 14:18:09.642702 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b498816b-102e-4feb-bf5d-0666528a370d" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.642726 4585 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b498816b-102e-4feb-bf5d-0666528a370d" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.642990 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b498816b-102e-4feb-bf5d-0666528a370d" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.643809 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.648726 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.648919 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.649134 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.654196 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.738379 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5djd\" (UniqueName: \"kubernetes.io/projected/1ce769f5-6fc1-4585-a050-98ec1e1d9915-kube-api-access-n5djd\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.738846 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ce769f5-6fc1-4585-a050-98ec1e1d9915-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.739151 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce769f5-6fc1-4585-a050-98ec1e1d9915-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.739379 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce769f5-6fc1-4585-a050-98ec1e1d9915-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.739522 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ce769f5-6fc1-4585-a050-98ec1e1d9915-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.841224 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ce769f5-6fc1-4585-a050-98ec1e1d9915-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.841302 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce769f5-6fc1-4585-a050-98ec1e1d9915-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.841334 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce769f5-6fc1-4585-a050-98ec1e1d9915-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.841360 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ce769f5-6fc1-4585-a050-98ec1e1d9915-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.841498 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5djd\" (UniqueName: \"kubernetes.io/projected/1ce769f5-6fc1-4585-a050-98ec1e1d9915-kube-api-access-n5djd\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.844997 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ce769f5-6fc1-4585-a050-98ec1e1d9915-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.845212 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce769f5-6fc1-4585-a050-98ec1e1d9915-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.845228 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ce769f5-6fc1-4585-a050-98ec1e1d9915-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.845383 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ce769f5-6fc1-4585-a050-98ec1e1d9915-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc kubenswrapper[4585]: I1201 14:18:09.859581 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5djd\" (UniqueName: \"kubernetes.io/projected/1ce769f5-6fc1-4585-a050-98ec1e1d9915-kube-api-access-n5djd\") pod \"nova-cell1-novncproxy-0\" (UID: \"1ce769f5-6fc1-4585-a050-98ec1e1d9915\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:09 crc 
kubenswrapper[4585]: I1201 14:18:09.975472 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:10 crc kubenswrapper[4585]: I1201 14:18:10.298870 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 14:18:10 crc kubenswrapper[4585]: I1201 14:18:10.424518 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b498816b-102e-4feb-bf5d-0666528a370d" path="/var/lib/kubelet/pods/b498816b-102e-4feb-bf5d-0666528a370d/volumes" Dec 01 14:18:10 crc kubenswrapper[4585]: I1201 14:18:10.466113 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 14:18:11 crc kubenswrapper[4585]: I1201 14:18:11.306986 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1ce769f5-6fc1-4585-a050-98ec1e1d9915","Type":"ContainerStarted","Data":"d548204e4fa0499daed7342bd2fd4505ee8f119d37efe7b625d6973092d4f0b2"} Dec 01 14:18:11 crc kubenswrapper[4585]: I1201 14:18:11.307278 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1ce769f5-6fc1-4585-a050-98ec1e1d9915","Type":"ContainerStarted","Data":"9030455fe5d0088c2dcf896fa553771a96e78333acebb084e64620b98fa886ee"} Dec 01 14:18:11 crc kubenswrapper[4585]: I1201 14:18:11.343270 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.343249881 podStartE2EDuration="2.343249881s" podCreationTimestamp="2025-12-01 14:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:18:11.329885334 +0000 UTC m=+1205.314099199" watchObservedRunningTime="2025-12-01 14:18:11.343249881 +0000 UTC m=+1205.327463736" Dec 01 14:18:11 crc kubenswrapper[4585]: I1201 14:18:11.893558 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 14:18:11 crc kubenswrapper[4585]: I1201 14:18:11.894831 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 14:18:11 crc kubenswrapper[4585]: I1201 14:18:11.895264 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 14:18:11 crc kubenswrapper[4585]: I1201 14:18:11.895619 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 14:18:11 crc kubenswrapper[4585]: I1201 14:18:11.900166 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 14:18:11 crc kubenswrapper[4585]: I1201 14:18:11.906046 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.122046 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-lxk24"] Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.123523 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.149898 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-lxk24"] Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.184877 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkwmf\" (UniqueName: \"kubernetes.io/projected/d3580eb6-20f0-4ed5-a45b-6b081edd487d-kube-api-access-wkwmf\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.184939 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.184999 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.185036 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.185057 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-config\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.185147 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.286945 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.287218 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkwmf\" (UniqueName: \"kubernetes.io/projected/d3580eb6-20f0-4ed5-a45b-6b081edd487d-kube-api-access-wkwmf\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.287261 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.287304 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.287342 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.287367 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-config\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.288309 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-config\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.288959 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.289903 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.291894 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.292575 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.310562 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkwmf\" (UniqueName: 
\"kubernetes.io/projected/d3580eb6-20f0-4ed5-a45b-6b081edd487d-kube-api-access-wkwmf\") pod \"dnsmasq-dns-89c5cd4d5-lxk24\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.442687 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:12 crc kubenswrapper[4585]: I1201 14:18:12.993338 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-lxk24"] Dec 01 14:18:13 crc kubenswrapper[4585]: I1201 14:18:13.366610 4585 generic.go:334] "Generic (PLEG): container finished" podID="d3580eb6-20f0-4ed5-a45b-6b081edd487d" containerID="9ed97749f4ec64edb57f645adc69e887c4cd1acd81ecf8eb255c340f7ea63f5b" exitCode=0 Dec 01 14:18:13 crc kubenswrapper[4585]: I1201 14:18:13.366773 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" event={"ID":"d3580eb6-20f0-4ed5-a45b-6b081edd487d","Type":"ContainerDied","Data":"9ed97749f4ec64edb57f645adc69e887c4cd1acd81ecf8eb255c340f7ea63f5b"} Dec 01 14:18:13 crc kubenswrapper[4585]: I1201 14:18:13.366900 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" event={"ID":"d3580eb6-20f0-4ed5-a45b-6b081edd487d","Type":"ContainerStarted","Data":"62c5562efca0f4a33895898c2e5e2b7ddc4301d722153d06ddd94c5e42bbbcfe"} Dec 01 14:18:13 crc kubenswrapper[4585]: I1201 14:18:13.716633 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:18:13 crc kubenswrapper[4585]: I1201 14:18:13.716997 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.377775 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" event={"ID":"d3580eb6-20f0-4ed5-a45b-6b081edd487d","Type":"ContainerStarted","Data":"f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9"} Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.378034 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.399958 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" podStartSLOduration=2.399938206 podStartE2EDuration="2.399938206s" podCreationTimestamp="2025-12-01 14:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:18:14.398843546 +0000 UTC m=+1208.383057411" watchObservedRunningTime="2025-12-01 14:18:14.399938206 +0000 UTC m=+1208.384152061" Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.670824 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.671425 4585 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="ceilometer-central-agent" containerID="cri-o://56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514" gracePeriod=30 Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.671581 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="proxy-httpd" containerID="cri-o://b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7" gracePeriod=30 Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.671636 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="sg-core" containerID="cri-o://bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28" gracePeriod=30 Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.671680 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="ceilometer-notification-agent" containerID="cri-o://f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709" gracePeriod=30 Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.685937 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.948337 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.950167 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8aecb332-b72b-4862-8416-3e657de2aefd" containerName="nova-api-api" containerID="cri-o://e77ac58462a4b79108075f8a16c90c07b74d4a30a529d6f0b7e5d6d14b7b357a" gracePeriod=30 Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.950087 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8aecb332-b72b-4862-8416-3e657de2aefd" containerName="nova-api-log" containerID="cri-o://3c7093cd7173bbcba9cabf4360fad48fc3e1868aaef456aa42e47b0b8effce35" gracePeriod=30 Dec 01 14:18:14 crc kubenswrapper[4585]: I1201 14:18:14.976862 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:15 crc kubenswrapper[4585]: I1201 14:18:15.391013 4585 generic.go:334] "Generic (PLEG): container finished" podID="8aecb332-b72b-4862-8416-3e657de2aefd" containerID="3c7093cd7173bbcba9cabf4360fad48fc3e1868aaef456aa42e47b0b8effce35" exitCode=143 Dec 01 14:18:15 crc kubenswrapper[4585]: I1201 14:18:15.391276 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aecb332-b72b-4862-8416-3e657de2aefd","Type":"ContainerDied","Data":"3c7093cd7173bbcba9cabf4360fad48fc3e1868aaef456aa42e47b0b8effce35"} Dec 01 14:18:15 crc kubenswrapper[4585]: I1201 14:18:15.393330 4585 generic.go:334] "Generic (PLEG): container finished" podID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerID="b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7" exitCode=0 Dec 01 14:18:15 crc kubenswrapper[4585]: I1201 14:18:15.393381 4585 generic.go:334] "Generic (PLEG): container finished" podID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerID="bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28" exitCode=2 Dec 01 14:18:15 crc kubenswrapper[4585]: I1201 
14:18:15.393393 4585 generic.go:334] "Generic (PLEG): container finished" podID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerID="56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514" exitCode=0 Dec 01 14:18:15 crc kubenswrapper[4585]: I1201 14:18:15.394520 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"598a75f1-7d24-40ff-818c-9cc023cf6b1d","Type":"ContainerDied","Data":"b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7"} Dec 01 14:18:15 crc kubenswrapper[4585]: I1201 14:18:15.394552 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"598a75f1-7d24-40ff-818c-9cc023cf6b1d","Type":"ContainerDied","Data":"bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28"} Dec 01 14:18:15 crc kubenswrapper[4585]: I1201 14:18:15.394567 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"598a75f1-7d24-40ff-818c-9cc023cf6b1d","Type":"ContainerDied","Data":"56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514"} Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.124611 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.188684 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598a75f1-7d24-40ff-818c-9cc023cf6b1d-log-httpd\") pod \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.188766 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598a75f1-7d24-40ff-818c-9cc023cf6b1d-run-httpd\") pod \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.188833 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-ceilometer-tls-certs\") pod \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.188906 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-sg-core-conf-yaml\") pod \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.188953 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zf98\" (UniqueName: \"kubernetes.io/projected/598a75f1-7d24-40ff-818c-9cc023cf6b1d-kube-api-access-7zf98\") pod \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.188993 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-combined-ca-bundle\") pod \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.189031 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-scripts\") pod \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.189148 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-config-data\") pod \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\" (UID: \"598a75f1-7d24-40ff-818c-9cc023cf6b1d\") " Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.189371 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/598a75f1-7d24-40ff-818c-9cc023cf6b1d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "598a75f1-7d24-40ff-818c-9cc023cf6b1d" (UID: "598a75f1-7d24-40ff-818c-9cc023cf6b1d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.189760 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598a75f1-7d24-40ff-818c-9cc023cf6b1d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.190069 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/598a75f1-7d24-40ff-818c-9cc023cf6b1d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "598a75f1-7d24-40ff-818c-9cc023cf6b1d" (UID: "598a75f1-7d24-40ff-818c-9cc023cf6b1d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.198752 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598a75f1-7d24-40ff-818c-9cc023cf6b1d-kube-api-access-7zf98" (OuterVolumeSpecName: "kube-api-access-7zf98") pod "598a75f1-7d24-40ff-818c-9cc023cf6b1d" (UID: "598a75f1-7d24-40ff-818c-9cc023cf6b1d"). InnerVolumeSpecName "kube-api-access-7zf98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.212012 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-scripts" (OuterVolumeSpecName: "scripts") pod "598a75f1-7d24-40ff-818c-9cc023cf6b1d" (UID: "598a75f1-7d24-40ff-818c-9cc023cf6b1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.254161 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "598a75f1-7d24-40ff-818c-9cc023cf6b1d" (UID: "598a75f1-7d24-40ff-818c-9cc023cf6b1d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.291304 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.291342 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zf98\" (UniqueName: \"kubernetes.io/projected/598a75f1-7d24-40ff-818c-9cc023cf6b1d-kube-api-access-7zf98\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.291356 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.291368 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/598a75f1-7d24-40ff-818c-9cc023cf6b1d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.302444 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "598a75f1-7d24-40ff-818c-9cc023cf6b1d" (UID: "598a75f1-7d24-40ff-818c-9cc023cf6b1d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.332503 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "598a75f1-7d24-40ff-818c-9cc023cf6b1d" (UID: "598a75f1-7d24-40ff-818c-9cc023cf6b1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.352132 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-config-data" (OuterVolumeSpecName: "config-data") pod "598a75f1-7d24-40ff-818c-9cc023cf6b1d" (UID: "598a75f1-7d24-40ff-818c-9cc023cf6b1d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.392645 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.393332 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.393414 4585 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/598a75f1-7d24-40ff-818c-9cc023cf6b1d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.403540 4585 generic.go:334] "Generic (PLEG): container finished" podID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerID="f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709" exitCode=0 Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.403578 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"598a75f1-7d24-40ff-818c-9cc023cf6b1d","Type":"ContainerDied","Data":"f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709"} Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.403600 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.403612 4585 scope.go:117] "RemoveContainer" containerID="b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.403601 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"598a75f1-7d24-40ff-818c-9cc023cf6b1d","Type":"ContainerDied","Data":"1862be9a992370ed34411787960e22c27d7aed1db079a059e7a20cb8864e5dd0"} Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.436227 4585 scope.go:117] "RemoveContainer" containerID="bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.453402 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.463471 4585 scope.go:117] "RemoveContainer" containerID="f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.478877 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.494001 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:18:16 crc kubenswrapper[4585]: E1201 14:18:16.502496 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="ceilometer-notification-agent" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.502532 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="ceilometer-notification-agent" Dec 01 14:18:16 crc kubenswrapper[4585]: E1201 14:18:16.502555 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="sg-core" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.502562 4585 
state_mem.go:107] "Deleted CPUSet assignment" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="sg-core" Dec 01 14:18:16 crc kubenswrapper[4585]: E1201 14:18:16.502581 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="proxy-httpd" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.502588 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="proxy-httpd" Dec 01 14:18:16 crc kubenswrapper[4585]: E1201 14:18:16.502606 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="ceilometer-central-agent" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.502613 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="ceilometer-central-agent" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.502817 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="proxy-httpd" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.502832 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="ceilometer-notification-agent" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.502843 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="ceilometer-central-agent" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.502865 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" containerName="sg-core" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.504557 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.509795 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.510092 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.511203 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.520181 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.532090 4585 scope.go:117] "RemoveContainer" containerID="56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.554468 4585 scope.go:117] "RemoveContainer" containerID="b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7" Dec 01 14:18:16 crc kubenswrapper[4585]: E1201 14:18:16.559957 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7\": container with ID starting with b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7 not found: ID does not exist" containerID="b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.560098 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7"} err="failed to get container status \"b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7\": rpc error: code = NotFound desc = could not find container \"b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7\": container with ID starting with b4813c79824deeaa37b1241c0d9acfa1e5dba7a9b048edb7a66b20a5bf0b54c7 not found: ID does not exist" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.560127 4585 scope.go:117] "RemoveContainer" containerID="bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28" Dec 01 14:18:16 crc kubenswrapper[4585]: E1201 14:18:16.560527 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28\": container with ID starting with bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28 not found: ID does not exist" containerID="bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.560549 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28"} err="failed to get container status \"bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28\": rpc error: code = NotFound desc = could not find container \"bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28\": container with ID starting with bcd328a250ed29664286a49c670772b93ce9f80a63db6bd812142e2a4a577a28 not found: ID does not exist" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.560562 4585 scope.go:117] "RemoveContainer" containerID="f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709" Dec 01 14:18:16 
crc kubenswrapper[4585]: E1201 14:18:16.564018 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709\": container with ID starting with f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709 not found: ID does not exist" containerID="f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.564059 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709"} err="failed to get container status \"f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709\": rpc error: code = NotFound desc = could not find container \"f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709\": container with ID starting with f7df3c8837871732f4cb59973da980a1f548e7c24cb1981ec75c59f6478c8709 not found: ID does not exist" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.564084 4585 scope.go:117] "RemoveContainer" containerID="56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514" Dec 01 14:18:16 crc kubenswrapper[4585]: E1201 14:18:16.570579 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514\": container with ID starting with 56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514 not found: ID does not exist" containerID="56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.570616 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514"} err="failed to get container status \"56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514\": rpc error: code = NotFound desc = could not find container \"56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514\": container with ID starting with 56ed34e01ef5221e28918e126f09c5316f9abcecff5977c46db1295865fa3514 not found: ID does not exist" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.602406 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65qtq\" (UniqueName: \"kubernetes.io/projected/f1b1f236-fce8-4c89-9a75-d2e965fa825d-kube-api-access-65qtq\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.602481 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b1f236-fce8-4c89-9a75-d2e965fa825d-log-httpd\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.602502 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-config-data\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.602534 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.602550 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.602568 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.602583 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-scripts\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.602655 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b1f236-fce8-4c89-9a75-d2e965fa825d-run-httpd\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.703924 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b1f236-fce8-4c89-9a75-d2e965fa825d-log-httpd\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.704088 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-config-data\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.704147 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.704175 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.704204 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 
14:18:16.704229 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-scripts\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.704326 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b1f236-fce8-4c89-9a75-d2e965fa825d-run-httpd\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.704391 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65qtq\" (UniqueName: \"kubernetes.io/projected/f1b1f236-fce8-4c89-9a75-d2e965fa825d-kube-api-access-65qtq\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.704471 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b1f236-fce8-4c89-9a75-d2e965fa825d-log-httpd\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.704795 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b1f236-fce8-4c89-9a75-d2e965fa825d-run-httpd\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.708995 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.709173 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.710734 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-config-data\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.711253 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.715395 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-scripts\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.721551 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-65qtq\" (UniqueName: \"kubernetes.io/projected/f1b1f236-fce8-4c89-9a75-d2e965fa825d-kube-api-access-65qtq\") pod \"ceilometer-0\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " pod="openstack/ceilometer-0" Dec 01 14:18:16 crc kubenswrapper[4585]: I1201 14:18:16.823156 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:18:17 crc kubenswrapper[4585]: I1201 14:18:17.347213 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:18:17 crc kubenswrapper[4585]: W1201 14:18:17.353068 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1b1f236_fce8_4c89_9a75_d2e965fa825d.slice/crio-82a21afb2d52f429b13c9a3048dce436ef1998664a99a196057b01ef77bf2a72 WatchSource:0}: Error finding container 82a21afb2d52f429b13c9a3048dce436ef1998664a99a196057b01ef77bf2a72: Status 404 returned error can't find the container with id 82a21afb2d52f429b13c9a3048dce436ef1998664a99a196057b01ef77bf2a72 Dec 01 14:18:17 crc kubenswrapper[4585]: I1201 14:18:17.417131 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b1f236-fce8-4c89-9a75-d2e965fa825d","Type":"ContainerStarted","Data":"82a21afb2d52f429b13c9a3048dce436ef1998664a99a196057b01ef77bf2a72"} Dec 01 14:18:17 crc kubenswrapper[4585]: I1201 14:18:17.419209 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.427284 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598a75f1-7d24-40ff-818c-9cc023cf6b1d" path="/var/lib/kubelet/pods/598a75f1-7d24-40ff-818c-9cc023cf6b1d/volumes" Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.455169 4585 generic.go:334] "Generic (PLEG): container finished" podID="8aecb332-b72b-4862-8416-3e657de2aefd" containerID="e77ac58462a4b79108075f8a16c90c07b74d4a30a529d6f0b7e5d6d14b7b357a" exitCode=0 Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.455265 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aecb332-b72b-4862-8416-3e657de2aefd","Type":"ContainerDied","Data":"e77ac58462a4b79108075f8a16c90c07b74d4a30a529d6f0b7e5d6d14b7b357a"} Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.457408 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b1f236-fce8-4c89-9a75-d2e965fa825d","Type":"ContainerStarted","Data":"5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1"} Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.502547 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.652985 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aecb332-b72b-4862-8416-3e657de2aefd-combined-ca-bundle\") pod \"8aecb332-b72b-4862-8416-3e657de2aefd\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.653232 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp4xw\" (UniqueName: \"kubernetes.io/projected/8aecb332-b72b-4862-8416-3e657de2aefd-kube-api-access-lp4xw\") pod \"8aecb332-b72b-4862-8416-3e657de2aefd\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.653278 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aecb332-b72b-4862-8416-3e657de2aefd-config-data\") pod \"8aecb332-b72b-4862-8416-3e657de2aefd\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.653356 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aecb332-b72b-4862-8416-3e657de2aefd-logs\") pod \"8aecb332-b72b-4862-8416-3e657de2aefd\" (UID: \"8aecb332-b72b-4862-8416-3e657de2aefd\") " Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.654312 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aecb332-b72b-4862-8416-3e657de2aefd-logs" (OuterVolumeSpecName: "logs") pod "8aecb332-b72b-4862-8416-3e657de2aefd" (UID: "8aecb332-b72b-4862-8416-3e657de2aefd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.662294 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aecb332-b72b-4862-8416-3e657de2aefd-kube-api-access-lp4xw" (OuterVolumeSpecName: "kube-api-access-lp4xw") pod "8aecb332-b72b-4862-8416-3e657de2aefd" (UID: "8aecb332-b72b-4862-8416-3e657de2aefd"). InnerVolumeSpecName "kube-api-access-lp4xw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.683922 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aecb332-b72b-4862-8416-3e657de2aefd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8aecb332-b72b-4862-8416-3e657de2aefd" (UID: "8aecb332-b72b-4862-8416-3e657de2aefd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.697941 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aecb332-b72b-4862-8416-3e657de2aefd-config-data" (OuterVolumeSpecName: "config-data") pod "8aecb332-b72b-4862-8416-3e657de2aefd" (UID: "8aecb332-b72b-4862-8416-3e657de2aefd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.755847 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp4xw\" (UniqueName: \"kubernetes.io/projected/8aecb332-b72b-4862-8416-3e657de2aefd-kube-api-access-lp4xw\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.755880 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aecb332-b72b-4862-8416-3e657de2aefd-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.755891 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aecb332-b72b-4862-8416-3e657de2aefd-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:18 crc kubenswrapper[4585]: I1201 14:18:18.755903 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aecb332-b72b-4862-8416-3e657de2aefd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.468813 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8aecb332-b72b-4862-8416-3e657de2aefd","Type":"ContainerDied","Data":"2a07bfcf03a191231b2b80d236993efd8c37b5df60737f3cbde1067e69b4ba5d"} Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.468857 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.469171 4585 scope.go:117] "RemoveContainer" containerID="e77ac58462a4b79108075f8a16c90c07b74d4a30a529d6f0b7e5d6d14b7b357a" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.472602 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b1f236-fce8-4c89-9a75-d2e965fa825d","Type":"ContainerStarted","Data":"bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232"} Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.489113 4585 scope.go:117] "RemoveContainer" containerID="3c7093cd7173bbcba9cabf4360fad48fc3e1868aaef456aa42e47b0b8effce35" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.512331 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.519681 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.537920 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 14:18:19 crc kubenswrapper[4585]: E1201 14:18:19.538324 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aecb332-b72b-4862-8416-3e657de2aefd" containerName="nova-api-log" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.538344 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aecb332-b72b-4862-8416-3e657de2aefd" containerName="nova-api-log" Dec 01 14:18:19 crc kubenswrapper[4585]: E1201 14:18:19.538354 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aecb332-b72b-4862-8416-3e657de2aefd" containerName="nova-api-api" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.538360 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aecb332-b72b-4862-8416-3e657de2aefd" containerName="nova-api-api" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.538544 4585 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8aecb332-b72b-4862-8416-3e657de2aefd" containerName="nova-api-api" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.538559 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aecb332-b72b-4862-8416-3e657de2aefd" containerName="nova-api-log" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.539472 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.541654 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.543380 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.559814 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.560489 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.683090 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf6jv\" (UniqueName: \"kubernetes.io/projected/1f8c300c-5295-4071-a82d-f3e5884bb729-kube-api-access-xf6jv\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.683222 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8c300c-5295-4071-a82d-f3e5884bb729-logs\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.683265 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-config-data\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.683347 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.683564 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-public-tls-certs\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.683619 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.785563 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf6jv\" (UniqueName: 
\"kubernetes.io/projected/1f8c300c-5295-4071-a82d-f3e5884bb729-kube-api-access-xf6jv\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.785660 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8c300c-5295-4071-a82d-f3e5884bb729-logs\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.785683 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-config-data\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.785743 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.785787 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-public-tls-certs\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.785805 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.786087 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8c300c-5295-4071-a82d-f3e5884bb729-logs\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.791237 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.792083 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-config-data\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.794794 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.794821 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.809278 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf6jv\" (UniqueName: \"kubernetes.io/projected/1f8c300c-5295-4071-a82d-f3e5884bb729-kube-api-access-xf6jv\") pod \"nova-api-0\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.853999 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.978999 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:19 crc kubenswrapper[4585]: I1201 14:18:19.994669 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.358995 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.430410 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aecb332-b72b-4862-8416-3e657de2aefd" path="/var/lib/kubelet/pods/8aecb332-b72b-4862-8416-3e657de2aefd/volumes" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.487197 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1f8c300c-5295-4071-a82d-f3e5884bb729","Type":"ContainerStarted","Data":"e8ca071b4524a386efc860e8ebb9508b944c4efcd707253009835b6ecbae7e81"} Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.499309 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b1f236-fce8-4c89-9a75-d2e965fa825d","Type":"ContainerStarted","Data":"732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983"} Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.520899 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.666807 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-q986x"] Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.668667 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.674323 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.674548 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.686805 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q986x"] Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.824739 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q986x\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.824824 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-scripts\") pod \"nova-cell1-cell-mapping-q986x\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.824849 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2vn2\" (UniqueName: \"kubernetes.io/projected/ae2d266c-cec5-4d51-afcc-d948d3cd7903-kube-api-access-l2vn2\") pod \"nova-cell1-cell-mapping-q986x\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.824890 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-config-data\") pod \"nova-cell1-cell-mapping-q986x\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.926954 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q986x\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.927600 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-scripts\") pod \"nova-cell1-cell-mapping-q986x\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.927630 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2vn2\" (UniqueName: \"kubernetes.io/projected/ae2d266c-cec5-4d51-afcc-d948d3cd7903-kube-api-access-l2vn2\") pod \"nova-cell1-cell-mapping-q986x\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.927690 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-config-data\") pod \"nova-cell1-cell-mapping-q986x\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.931243 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-config-data\") pod \"nova-cell1-cell-mapping-q986x\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.931695 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-q986x\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.932863 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-scripts\") pod \"nova-cell1-cell-mapping-q986x\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:20 crc kubenswrapper[4585]: I1201 14:18:20.947169 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2vn2\" (UniqueName: \"kubernetes.io/projected/ae2d266c-cec5-4d51-afcc-d948d3cd7903-kube-api-access-l2vn2\") pod \"nova-cell1-cell-mapping-q986x\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:21 crc kubenswrapper[4585]: I1201 14:18:21.011676 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:21 crc kubenswrapper[4585]: I1201 14:18:21.510448 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1f8c300c-5295-4071-a82d-f3e5884bb729","Type":"ContainerStarted","Data":"865eaef9d545ca0d34430e183770b2e4fdd6be28b7d1f37a030135df2976a52f"} Dec 01 14:18:21 crc kubenswrapper[4585]: I1201 14:18:21.510730 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1f8c300c-5295-4071-a82d-f3e5884bb729","Type":"ContainerStarted","Data":"a1bc6c9c9a427bd35f544fddf7f399ddbcbb9f157c8d480e5433e1169a02e7bf"} Dec 01 14:18:21 crc kubenswrapper[4585]: I1201 14:18:21.526537 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.526519456 podStartE2EDuration="2.526519456s" podCreationTimestamp="2025-12-01 14:18:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:18:21.52556914 +0000 UTC m=+1215.509782995" watchObservedRunningTime="2025-12-01 14:18:21.526519456 +0000 UTC m=+1215.510733311" Dec 01 14:18:21 crc kubenswrapper[4585]: I1201 14:18:21.567549 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-q986x"] Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.445114 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.563088 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b1f236-fce8-4c89-9a75-d2e965fa825d","Type":"ContainerStarted","Data":"8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb"} Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.563432 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.563170 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="ceilometer-central-agent" containerID="cri-o://5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1" gracePeriod=30 Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.563648 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="proxy-httpd" containerID="cri-o://8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb" gracePeriod=30 Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.563705 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="ceilometer-notification-agent" containerID="cri-o://bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232" gracePeriod=30 Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.563738 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="sg-core" containerID="cri-o://732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983" gracePeriod=30 Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.568873 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-757b4f8459-hdklj"] Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.592464 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" podUID="c2cb820e-8840-4482-9866-e51474883db3" containerName="dnsmasq-dns" containerID="cri-o://de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a" gracePeriod=10 Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.595238 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q986x" event={"ID":"ae2d266c-cec5-4d51-afcc-d948d3cd7903","Type":"ContainerStarted","Data":"06d1b0509366b8cb1d0354b0ea6d376a208ae3ae2bdfdfa9d6a09d3b1c2672f7"} Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.595303 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q986x" event={"ID":"ae2d266c-cec5-4d51-afcc-d948d3cd7903","Type":"ContainerStarted","Data":"98d9f2e82b84f74036694aee96fd27bb1d23999f0434af9a77d2d842feec8f31"} Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.626486 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.897736584 podStartE2EDuration="6.626461379s" podCreationTimestamp="2025-12-01 14:18:16 +0000 UTC" firstStartedPulling="2025-12-01 14:18:17.355846943 +0000 UTC m=+1211.340060788" lastFinishedPulling="2025-12-01 14:18:22.084571728 +0000 UTC m=+1216.068785583" observedRunningTime="2025-12-01 14:18:22.622120613 +0000 UTC m=+1216.606334478" watchObservedRunningTime="2025-12-01 14:18:22.626461379 +0000 UTC m=+1216.610675234" Dec 01 14:18:22 crc kubenswrapper[4585]: I1201 14:18:22.685695 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-q986x" podStartSLOduration=2.685675458 podStartE2EDuration="2.685675458s" podCreationTimestamp="2025-12-01 14:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:18:22.670309088 +0000 UTC m=+1216.654522943" watchObservedRunningTime="2025-12-01 14:18:22.685675458 +0000 UTC m=+1216.669889313" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.295414 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.462111 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-dns-svc\") pod \"c2cb820e-8840-4482-9866-e51474883db3\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.462199 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-dns-swift-storage-0\") pod \"c2cb820e-8840-4482-9866-e51474883db3\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.462231 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-ovsdbserver-nb\") pod \"c2cb820e-8840-4482-9866-e51474883db3\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.462291 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-ovsdbserver-sb\") pod \"c2cb820e-8840-4482-9866-e51474883db3\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.462322 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqd2h\" (UniqueName: \"kubernetes.io/projected/c2cb820e-8840-4482-9866-e51474883db3-kube-api-access-dqd2h\") pod \"c2cb820e-8840-4482-9866-e51474883db3\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.462364 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-config\") pod \"c2cb820e-8840-4482-9866-e51474883db3\" (UID: \"c2cb820e-8840-4482-9866-e51474883db3\") " Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.471197 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2cb820e-8840-4482-9866-e51474883db3-kube-api-access-dqd2h" (OuterVolumeSpecName: "kube-api-access-dqd2h") pod "c2cb820e-8840-4482-9866-e51474883db3" (UID: "c2cb820e-8840-4482-9866-e51474883db3"). InnerVolumeSpecName "kube-api-access-dqd2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.512136 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c2cb820e-8840-4482-9866-e51474883db3" (UID: "c2cb820e-8840-4482-9866-e51474883db3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.514860 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c2cb820e-8840-4482-9866-e51474883db3" (UID: "c2cb820e-8840-4482-9866-e51474883db3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.515506 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-config" (OuterVolumeSpecName: "config") pod "c2cb820e-8840-4482-9866-e51474883db3" (UID: "c2cb820e-8840-4482-9866-e51474883db3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.519962 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c2cb820e-8840-4482-9866-e51474883db3" (UID: "c2cb820e-8840-4482-9866-e51474883db3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.529451 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c2cb820e-8840-4482-9866-e51474883db3" (UID: "c2cb820e-8840-4482-9866-e51474883db3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.566041 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.566071 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.566082 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqd2h\" (UniqueName: \"kubernetes.io/projected/c2cb820e-8840-4482-9866-e51474883db3-kube-api-access-dqd2h\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.566094 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.566104 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.566112 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c2cb820e-8840-4482-9866-e51474883db3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.605649 4585 generic.go:334] "Generic (PLEG): container finished" podID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerID="8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb" exitCode=0 Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.605679 4585 generic.go:334] "Generic (PLEG): container finished" podID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerID="732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983" exitCode=2 Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.605687 4585 
generic.go:334] "Generic (PLEG): container finished" podID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerID="bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232" exitCode=0 Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.605720 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b1f236-fce8-4c89-9a75-d2e965fa825d","Type":"ContainerDied","Data":"8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb"} Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.605743 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b1f236-fce8-4c89-9a75-d2e965fa825d","Type":"ContainerDied","Data":"732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983"} Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.605754 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b1f236-fce8-4c89-9a75-d2e965fa825d","Type":"ContainerDied","Data":"bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232"} Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.607304 4585 generic.go:334] "Generic (PLEG): container finished" podID="c2cb820e-8840-4482-9866-e51474883db3" containerID="de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a" exitCode=0 Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.608084 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.608171 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" event={"ID":"c2cb820e-8840-4482-9866-e51474883db3","Type":"ContainerDied","Data":"de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a"} Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.608195 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-hdklj" event={"ID":"c2cb820e-8840-4482-9866-e51474883db3","Type":"ContainerDied","Data":"f0ec22ede8bee54b854145ba441e3e7e356b11fe43a49b9d98eadee758589cc6"} Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.608212 4585 scope.go:117] "RemoveContainer" containerID="de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.640324 4585 scope.go:117] "RemoveContainer" containerID="bdff6b51a06e09f8e1f880935bd6b22491c4015333f6ea4cf966f1fcf2bd3763" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.644368 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-hdklj"] Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.656425 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-hdklj"] Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.659550 4585 scope.go:117] "RemoveContainer" containerID="de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a" Dec 01 14:18:23 crc kubenswrapper[4585]: E1201 14:18:23.660336 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a\": container with ID starting with de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a not found: ID does not exist" containerID="de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.660365 4585 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a"} err="failed to get container status \"de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a\": rpc error: code = NotFound desc = could not find container \"de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a\": container with ID starting with de5b5e57382acf7f9d83c321db9e9248c37525425646371ecdbe826abc330d5a not found: ID does not exist" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.660386 4585 scope.go:117] "RemoveContainer" containerID="bdff6b51a06e09f8e1f880935bd6b22491c4015333f6ea4cf966f1fcf2bd3763" Dec 01 14:18:23 crc kubenswrapper[4585]: E1201 14:18:23.661221 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdff6b51a06e09f8e1f880935bd6b22491c4015333f6ea4cf966f1fcf2bd3763\": container with ID starting with bdff6b51a06e09f8e1f880935bd6b22491c4015333f6ea4cf966f1fcf2bd3763 not found: ID does not exist" containerID="bdff6b51a06e09f8e1f880935bd6b22491c4015333f6ea4cf966f1fcf2bd3763" Dec 01 14:18:23 crc kubenswrapper[4585]: I1201 14:18:23.661265 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdff6b51a06e09f8e1f880935bd6b22491c4015333f6ea4cf966f1fcf2bd3763"} err="failed to get container status \"bdff6b51a06e09f8e1f880935bd6b22491c4015333f6ea4cf966f1fcf2bd3763\": rpc error: code = NotFound desc = could not find container \"bdff6b51a06e09f8e1f880935bd6b22491c4015333f6ea4cf966f1fcf2bd3763\": container with ID starting with bdff6b51a06e09f8e1f880935bd6b22491c4015333f6ea4cf966f1fcf2bd3763 not found: ID does not exist" Dec 01 14:18:24 crc kubenswrapper[4585]: I1201 14:18:24.421824 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2cb820e-8840-4482-9866-e51474883db3" path="/var/lib/kubelet/pods/c2cb820e-8840-4482-9866-e51474883db3/volumes" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.002248 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.149901 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-config-data\") pod \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.150276 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-scripts\") pod \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.150361 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-combined-ca-bundle\") pod \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.150473 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-ceilometer-tls-certs\") pod \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.150499 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-sg-core-conf-yaml\") pod \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.150945 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65qtq\" (UniqueName: \"kubernetes.io/projected/f1b1f236-fce8-4c89-9a75-d2e965fa825d-kube-api-access-65qtq\") pod \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.151001 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b1f236-fce8-4c89-9a75-d2e965fa825d-log-httpd\") pod \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.151506 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b1f236-fce8-4c89-9a75-d2e965fa825d-run-httpd\") pod \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\" (UID: \"f1b1f236-fce8-4c89-9a75-d2e965fa825d\") " Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.151627 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1b1f236-fce8-4c89-9a75-d2e965fa825d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f1b1f236-fce8-4c89-9a75-d2e965fa825d" (UID: "f1b1f236-fce8-4c89-9a75-d2e965fa825d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.152044 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1b1f236-fce8-4c89-9a75-d2e965fa825d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f1b1f236-fce8-4c89-9a75-d2e965fa825d" (UID: "f1b1f236-fce8-4c89-9a75-d2e965fa825d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.152515 4585 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b1f236-fce8-4c89-9a75-d2e965fa825d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.152536 4585 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1b1f236-fce8-4c89-9a75-d2e965fa825d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.160351 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1b1f236-fce8-4c89-9a75-d2e965fa825d-kube-api-access-65qtq" (OuterVolumeSpecName: "kube-api-access-65qtq") pod "f1b1f236-fce8-4c89-9a75-d2e965fa825d" (UID: "f1b1f236-fce8-4c89-9a75-d2e965fa825d"). InnerVolumeSpecName "kube-api-access-65qtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.169830 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-scripts" (OuterVolumeSpecName: "scripts") pod "f1b1f236-fce8-4c89-9a75-d2e965fa825d" (UID: "f1b1f236-fce8-4c89-9a75-d2e965fa825d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.213787 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f1b1f236-fce8-4c89-9a75-d2e965fa825d" (UID: "f1b1f236-fce8-4c89-9a75-d2e965fa825d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.254521 4585 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.254545 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65qtq\" (UniqueName: \"kubernetes.io/projected/f1b1f236-fce8-4c89-9a75-d2e965fa825d-kube-api-access-65qtq\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.254558 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.262134 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1b1f236-fce8-4c89-9a75-d2e965fa825d" (UID: "f1b1f236-fce8-4c89-9a75-d2e965fa825d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.268257 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f1b1f236-fce8-4c89-9a75-d2e965fa825d" (UID: "f1b1f236-fce8-4c89-9a75-d2e965fa825d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.284974 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-config-data" (OuterVolumeSpecName: "config-data") pod "f1b1f236-fce8-4c89-9a75-d2e965fa825d" (UID: "f1b1f236-fce8-4c89-9a75-d2e965fa825d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.355903 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.355935 4585 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.355947 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1b1f236-fce8-4c89-9a75-d2e965fa825d-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.652557 4585 generic.go:334] "Generic (PLEG): container finished" podID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerID="5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1" exitCode=0 Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.652597 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b1f236-fce8-4c89-9a75-d2e965fa825d","Type":"ContainerDied","Data":"5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1"} Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.652625 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1b1f236-fce8-4c89-9a75-d2e965fa825d","Type":"ContainerDied","Data":"82a21afb2d52f429b13c9a3048dce436ef1998664a99a196057b01ef77bf2a72"} Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.652642 4585 scope.go:117] "RemoveContainer" containerID="8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.652760 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.702643 4585 scope.go:117] "RemoveContainer" containerID="732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.756021 4585 scope.go:117] "RemoveContainer" containerID="bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.757052 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.774014 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.786199 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:18:27 crc kubenswrapper[4585]: E1201 14:18:27.786643 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="proxy-httpd" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.786664 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="proxy-httpd" Dec 01 14:18:27 crc kubenswrapper[4585]: E1201 14:18:27.786682 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="ceilometer-notification-agent" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.786690 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="ceilometer-notification-agent" Dec 01 14:18:27 crc kubenswrapper[4585]: E1201 14:18:27.786707 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="sg-core" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.786716 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="sg-core" Dec 01 14:18:27 crc kubenswrapper[4585]: E1201 14:18:27.786738 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cb820e-8840-4482-9866-e51474883db3" containerName="init" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.786745 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cb820e-8840-4482-9866-e51474883db3" containerName="init" Dec 01 14:18:27 crc kubenswrapper[4585]: E1201 14:18:27.786774 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cb820e-8840-4482-9866-e51474883db3" containerName="dnsmasq-dns" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.786784 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cb820e-8840-4482-9866-e51474883db3" containerName="dnsmasq-dns" Dec 01 14:18:27 crc kubenswrapper[4585]: E1201 14:18:27.786796 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="ceilometer-central-agent" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.786802 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="ceilometer-central-agent" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.787002 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="ceilometer-central-agent" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.787019 4585 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="sg-core" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.787029 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cb820e-8840-4482-9866-e51474883db3" containerName="dnsmasq-dns" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.787043 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="proxy-httpd" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.787053 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" containerName="ceilometer-notification-agent" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.788641 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.792692 4585 scope.go:117] "RemoveContainer" containerID="5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.794160 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.794222 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.799599 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.804635 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.858394 4585 scope.go:117] "RemoveContainer" containerID="8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb" Dec 01 14:18:27 crc kubenswrapper[4585]: E1201 14:18:27.860530 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb\": container with ID starting with 8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb not found: ID does not exist" containerID="8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.860562 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb"} err="failed to get container status \"8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb\": rpc error: code = NotFound desc = could not find container \"8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb\": container with ID starting with 8a04b83d094fc306326f46ab8c5035d6b288dfe5eb33a0be4525537324a54ceb not found: ID does not exist" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.860583 4585 scope.go:117] "RemoveContainer" containerID="732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983" Dec 01 14:18:27 crc kubenswrapper[4585]: E1201 14:18:27.861095 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983\": container with ID starting with 732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983 not found: ID does not exist" 
containerID="732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.861118 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983"} err="failed to get container status \"732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983\": rpc error: code = NotFound desc = could not find container \"732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983\": container with ID starting with 732c758080d9c6c71f7dd8db447471492db654b47c63a8afe337d1fa7085b983 not found: ID does not exist" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.861138 4585 scope.go:117] "RemoveContainer" containerID="bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232" Dec 01 14:18:27 crc kubenswrapper[4585]: E1201 14:18:27.861501 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232\": container with ID starting with bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232 not found: ID does not exist" containerID="bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.861539 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232"} err="failed to get container status \"bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232\": rpc error: code = NotFound desc = could not find container \"bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232\": container with ID starting with bd7b61ade92593f15fa3a58d624b549bca8354a86f6095eec127c709e376c232 not found: ID does not exist" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.861568 4585 scope.go:117] "RemoveContainer" containerID="5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1" Dec 01 14:18:27 crc kubenswrapper[4585]: E1201 14:18:27.862029 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1\": container with ID starting with 5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1 not found: ID does not exist" containerID="5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.862099 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1"} err="failed to get container status \"5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1\": rpc error: code = NotFound desc = could not find container \"5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1\": container with ID starting with 5c79f0732bf51b071ac5241f64c9ae67550f1697d0b226ec87fefc3696a092c1 not found: ID does not exist" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.966575 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-config-data\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:27 crc kubenswrapper[4585]: 
I1201 14:18:27.966622 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-scripts\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.966838 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.967099 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.967329 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7642d2c9-b2cc-400c-b45a-957690fb2e86-run-httpd\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.967394 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.967425 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7642d2c9-b2cc-400c-b45a-957690fb2e86-log-httpd\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:27 crc kubenswrapper[4585]: I1201 14:18:27.967496 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt7xj\" (UniqueName: \"kubernetes.io/projected/7642d2c9-b2cc-400c-b45a-957690fb2e86-kube-api-access-xt7xj\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.069163 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.069258 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.069331 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7642d2c9-b2cc-400c-b45a-957690fb2e86-run-httpd\") 
pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.069350 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.069384 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7642d2c9-b2cc-400c-b45a-957690fb2e86-log-httpd\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.069416 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt7xj\" (UniqueName: \"kubernetes.io/projected/7642d2c9-b2cc-400c-b45a-957690fb2e86-kube-api-access-xt7xj\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.069476 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-config-data\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.069495 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-scripts\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.069944 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7642d2c9-b2cc-400c-b45a-957690fb2e86-log-httpd\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.070033 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7642d2c9-b2cc-400c-b45a-957690fb2e86-run-httpd\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.073939 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.074649 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.074828 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.075151 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-config-data\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.075559 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7642d2c9-b2cc-400c-b45a-957690fb2e86-scripts\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.091085 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt7xj\" (UniqueName: \"kubernetes.io/projected/7642d2c9-b2cc-400c-b45a-957690fb2e86-kube-api-access-xt7xj\") pod \"ceilometer-0\" (UID: \"7642d2c9-b2cc-400c-b45a-957690fb2e86\") " pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.148693 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.440105 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1b1f236-fce8-4c89-9a75-d2e965fa825d" path="/var/lib/kubelet/pods/f1b1f236-fce8-4c89-9a75-d2e965fa825d/volumes" Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.594228 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.663569 4585 generic.go:334] "Generic (PLEG): container finished" podID="ae2d266c-cec5-4d51-afcc-d948d3cd7903" containerID="06d1b0509366b8cb1d0354b0ea6d376a208ae3ae2bdfdfa9d6a09d3b1c2672f7" exitCode=0 Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.663670 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q986x" event={"ID":"ae2d266c-cec5-4d51-afcc-d948d3cd7903","Type":"ContainerDied","Data":"06d1b0509366b8cb1d0354b0ea6d376a208ae3ae2bdfdfa9d6a09d3b1c2672f7"} Dec 01 14:18:28 crc kubenswrapper[4585]: I1201 14:18:28.666120 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7642d2c9-b2cc-400c-b45a-957690fb2e86","Type":"ContainerStarted","Data":"efaa48fb5538c93b53bcbfe0b5b1eda84d866ba4d92936469f4965c0929fdde6"} Dec 01 14:18:29 crc kubenswrapper[4585]: I1201 14:18:29.678922 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7642d2c9-b2cc-400c-b45a-957690fb2e86","Type":"ContainerStarted","Data":"ce613b6271059c2d03765689522c6bc72ef274ed412233debcdcab040cc7e8e0"} Dec 01 14:18:29 crc kubenswrapper[4585]: I1201 14:18:29.855588 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 14:18:29 crc kubenswrapper[4585]: I1201 14:18:29.855861 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.160846 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.324911 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-combined-ca-bundle\") pod \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.325342 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2vn2\" (UniqueName: \"kubernetes.io/projected/ae2d266c-cec5-4d51-afcc-d948d3cd7903-kube-api-access-l2vn2\") pod \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.325419 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-scripts\") pod \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.325546 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-config-data\") pod \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\" (UID: \"ae2d266c-cec5-4d51-afcc-d948d3cd7903\") " Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.340141 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae2d266c-cec5-4d51-afcc-d948d3cd7903-kube-api-access-l2vn2" (OuterVolumeSpecName: "kube-api-access-l2vn2") pod "ae2d266c-cec5-4d51-afcc-d948d3cd7903" (UID: "ae2d266c-cec5-4d51-afcc-d948d3cd7903"). InnerVolumeSpecName "kube-api-access-l2vn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.375122 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-scripts" (OuterVolumeSpecName: "scripts") pod "ae2d266c-cec5-4d51-afcc-d948d3cd7903" (UID: "ae2d266c-cec5-4d51-afcc-d948d3cd7903"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.399125 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae2d266c-cec5-4d51-afcc-d948d3cd7903" (UID: "ae2d266c-cec5-4d51-afcc-d948d3cd7903"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.427354 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.427376 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2vn2\" (UniqueName: \"kubernetes.io/projected/ae2d266c-cec5-4d51-afcc-d948d3cd7903-kube-api-access-l2vn2\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.427386 4585 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.441149 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-config-data" (OuterVolumeSpecName: "config-data") pod "ae2d266c-cec5-4d51-afcc-d948d3cd7903" (UID: "ae2d266c-cec5-4d51-afcc-d948d3cd7903"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.529481 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2d266c-cec5-4d51-afcc-d948d3cd7903-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.688811 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-q986x" event={"ID":"ae2d266c-cec5-4d51-afcc-d948d3cd7903","Type":"ContainerDied","Data":"98d9f2e82b84f74036694aee96fd27bb1d23999f0434af9a77d2d842feec8f31"} Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.688849 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98d9f2e82b84f74036694aee96fd27bb1d23999f0434af9a77d2d842feec8f31" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.690018 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-q986x" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.692220 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7642d2c9-b2cc-400c-b45a-957690fb2e86","Type":"ContainerStarted","Data":"71d5e40600806a57542f9b4db9d3a0d7239abb1653e22f0f70ab17eab9073cc9"} Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.909157 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1f8c300c-5295-4071-a82d-f3e5884bb729" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.909174 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1f8c300c-5295-4071-a82d-f3e5884bb729" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.991151 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.991353 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1f8c300c-5295-4071-a82d-f3e5884bb729" containerName="nova-api-log" containerID="cri-o://a1bc6c9c9a427bd35f544fddf7f399ddbcbb9f157c8d480e5433e1169a02e7bf" gracePeriod=30 Dec 01 14:18:30 crc kubenswrapper[4585]: I1201 14:18:30.991716 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1f8c300c-5295-4071-a82d-f3e5884bb729" containerName="nova-api-api" containerID="cri-o://865eaef9d545ca0d34430e183770b2e4fdd6be28b7d1f37a030135df2976a52f" gracePeriod=30 Dec 01 14:18:31 crc kubenswrapper[4585]: I1201 14:18:31.008594 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:18:31 crc kubenswrapper[4585]: I1201 14:18:31.009071 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="991dd5fb-9091-44df-a27a-9f130b663a51" containerName="nova-scheduler-scheduler" containerID="cri-o://da1082cd1c1f62526de89333e81b0e2a2c3f5a276706131e9ce35f4059abf6e0" gracePeriod=30 Dec 01 14:18:31 crc kubenswrapper[4585]: I1201 14:18:31.062260 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:18:31 crc kubenswrapper[4585]: I1201 14:18:31.062491 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" containerName="nova-metadata-log" containerID="cri-o://798fc74143f881628ca03f67176da27c5f13d751e6b0423d7a360227f11e8a68" gracePeriod=30 Dec 01 14:18:31 crc kubenswrapper[4585]: I1201 14:18:31.062664 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" containerName="nova-metadata-metadata" containerID="cri-o://199cdb1bedef6b3ef896b88c40c0ae60097e550577cb48b6a0b29dc388671876" gracePeriod=30 Dec 01 14:18:31 crc kubenswrapper[4585]: I1201 14:18:31.703661 4585 generic.go:334] "Generic (PLEG): container finished" podID="1f8c300c-5295-4071-a82d-f3e5884bb729" containerID="a1bc6c9c9a427bd35f544fddf7f399ddbcbb9f157c8d480e5433e1169a02e7bf" 
exitCode=143 Dec 01 14:18:31 crc kubenswrapper[4585]: I1201 14:18:31.703715 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1f8c300c-5295-4071-a82d-f3e5884bb729","Type":"ContainerDied","Data":"a1bc6c9c9a427bd35f544fddf7f399ddbcbb9f157c8d480e5433e1169a02e7bf"} Dec 01 14:18:31 crc kubenswrapper[4585]: I1201 14:18:31.707019 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7642d2c9-b2cc-400c-b45a-957690fb2e86","Type":"ContainerStarted","Data":"b3e7d56e1dd3082af934d7cdf05dc8e91f44c4b34ee23a4b99576a45ab9165e8"} Dec 01 14:18:31 crc kubenswrapper[4585]: I1201 14:18:31.709354 4585 generic.go:334] "Generic (PLEG): container finished" podID="d9080235-3e98-4f0f-945c-475e6de22a49" containerID="798fc74143f881628ca03f67176da27c5f13d751e6b0423d7a360227f11e8a68" exitCode=143 Dec 01 14:18:31 crc kubenswrapper[4585]: I1201 14:18:31.709422 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9080235-3e98-4f0f-945c-475e6de22a49","Type":"ContainerDied","Data":"798fc74143f881628ca03f67176da27c5f13d751e6b0423d7a360227f11e8a68"} Dec 01 14:18:31 crc kubenswrapper[4585]: E1201 14:18:31.810348 4585 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="da1082cd1c1f62526de89333e81b0e2a2c3f5a276706131e9ce35f4059abf6e0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 14:18:31 crc kubenswrapper[4585]: E1201 14:18:31.812609 4585 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="da1082cd1c1f62526de89333e81b0e2a2c3f5a276706131e9ce35f4059abf6e0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 14:18:31 crc kubenswrapper[4585]: E1201 14:18:31.813896 4585 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="da1082cd1c1f62526de89333e81b0e2a2c3f5a276706131e9ce35f4059abf6e0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 14:18:31 crc kubenswrapper[4585]: E1201 14:18:31.813940 4585 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="991dd5fb-9091-44df-a27a-9f130b663a51" containerName="nova-scheduler-scheduler" Dec 01 14:18:33 crc kubenswrapper[4585]: I1201 14:18:33.728529 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7642d2c9-b2cc-400c-b45a-957690fb2e86","Type":"ContainerStarted","Data":"432565f8b233bc0c3fe264331f395706b257abe8b7e6207febe3a0bd1b0f7260"} Dec 01 14:18:33 crc kubenswrapper[4585]: I1201 14:18:33.730454 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 14:18:34 crc kubenswrapper[4585]: I1201 14:18:34.485059 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:35498->10.217.0.193:8775: read: connection reset by 
peer" Dec 01 14:18:34 crc kubenswrapper[4585]: I1201 14:18:34.485303 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:35490->10.217.0.193:8775: read: connection reset by peer" Dec 01 14:18:34 crc kubenswrapper[4585]: I1201 14:18:34.754617 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9080235-3e98-4f0f-945c-475e6de22a49","Type":"ContainerDied","Data":"199cdb1bedef6b3ef896b88c40c0ae60097e550577cb48b6a0b29dc388671876"} Dec 01 14:18:34 crc kubenswrapper[4585]: I1201 14:18:34.754847 4585 generic.go:334] "Generic (PLEG): container finished" podID="d9080235-3e98-4f0f-945c-475e6de22a49" containerID="199cdb1bedef6b3ef896b88c40c0ae60097e550577cb48b6a0b29dc388671876" exitCode=0 Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.013352 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.030830 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.334890265 podStartE2EDuration="8.030815595s" podCreationTimestamp="2025-12-01 14:18:27 +0000 UTC" firstStartedPulling="2025-12-01 14:18:28.610794728 +0000 UTC m=+1222.595008583" lastFinishedPulling="2025-12-01 14:18:33.306720058 +0000 UTC m=+1227.290933913" observedRunningTime="2025-12-01 14:18:33.772315424 +0000 UTC m=+1227.756529279" watchObservedRunningTime="2025-12-01 14:18:35.030815595 +0000 UTC m=+1229.015029440" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.031733 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-combined-ca-bundle\") pod \"d9080235-3e98-4f0f-945c-475e6de22a49\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.031818 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-nova-metadata-tls-certs\") pod \"d9080235-3e98-4f0f-945c-475e6de22a49\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.031890 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtmlx\" (UniqueName: \"kubernetes.io/projected/d9080235-3e98-4f0f-945c-475e6de22a49-kube-api-access-gtmlx\") pod \"d9080235-3e98-4f0f-945c-475e6de22a49\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.031929 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-config-data\") pod \"d9080235-3e98-4f0f-945c-475e6de22a49\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.031962 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9080235-3e98-4f0f-945c-475e6de22a49-logs\") pod \"d9080235-3e98-4f0f-945c-475e6de22a49\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 
14:18:35.033741 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9080235-3e98-4f0f-945c-475e6de22a49-logs" (OuterVolumeSpecName: "logs") pod "d9080235-3e98-4f0f-945c-475e6de22a49" (UID: "d9080235-3e98-4f0f-945c-475e6de22a49"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.062227 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9080235-3e98-4f0f-945c-475e6de22a49-kube-api-access-gtmlx" (OuterVolumeSpecName: "kube-api-access-gtmlx") pod "d9080235-3e98-4f0f-945c-475e6de22a49" (UID: "d9080235-3e98-4f0f-945c-475e6de22a49"). InnerVolumeSpecName "kube-api-access-gtmlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.094060 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9080235-3e98-4f0f-945c-475e6de22a49" (UID: "d9080235-3e98-4f0f-945c-475e6de22a49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.118243 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d9080235-3e98-4f0f-945c-475e6de22a49" (UID: "d9080235-3e98-4f0f-945c-475e6de22a49"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.132638 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-config-data" (OuterVolumeSpecName: "config-data") pod "d9080235-3e98-4f0f-945c-475e6de22a49" (UID: "d9080235-3e98-4f0f-945c-475e6de22a49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.133340 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-config-data\") pod \"d9080235-3e98-4f0f-945c-475e6de22a49\" (UID: \"d9080235-3e98-4f0f-945c-475e6de22a49\") " Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.133760 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9080235-3e98-4f0f-945c-475e6de22a49-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.133778 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.133789 4585 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.133799 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtmlx\" (UniqueName: \"kubernetes.io/projected/d9080235-3e98-4f0f-945c-475e6de22a49-kube-api-access-gtmlx\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:35 crc kubenswrapper[4585]: W1201 14:18:35.133863 4585 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d9080235-3e98-4f0f-945c-475e6de22a49/volumes/kubernetes.io~secret/config-data Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.133872 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-config-data" (OuterVolumeSpecName: "config-data") pod "d9080235-3e98-4f0f-945c-475e6de22a49" (UID: "d9080235-3e98-4f0f-945c-475e6de22a49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.235123 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9080235-3e98-4f0f-945c-475e6de22a49-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.764452 4585 generic.go:334] "Generic (PLEG): container finished" podID="991dd5fb-9091-44df-a27a-9f130b663a51" containerID="da1082cd1c1f62526de89333e81b0e2a2c3f5a276706131e9ce35f4059abf6e0" exitCode=0 Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.764787 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"991dd5fb-9091-44df-a27a-9f130b663a51","Type":"ContainerDied","Data":"da1082cd1c1f62526de89333e81b0e2a2c3f5a276706131e9ce35f4059abf6e0"} Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.769124 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d9080235-3e98-4f0f-945c-475e6de22a49","Type":"ContainerDied","Data":"69146a789b11c65e32d09683cec09d36396e40c502c30b9d3771ccc6903a995f"} Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.769197 4585 scope.go:117] "RemoveContainer" containerID="199cdb1bedef6b3ef896b88c40c0ae60097e550577cb48b6a0b29dc388671876" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.769249 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.810253 4585 scope.go:117] "RemoveContainer" containerID="798fc74143f881628ca03f67176da27c5f13d751e6b0423d7a360227f11e8a68" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.811072 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.819036 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.885353 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:18:35 crc kubenswrapper[4585]: E1201 14:18:35.886597 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" containerName="nova-metadata-metadata" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.886719 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" containerName="nova-metadata-metadata" Dec 01 14:18:35 crc kubenswrapper[4585]: E1201 14:18:35.886803 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae2d266c-cec5-4d51-afcc-d948d3cd7903" containerName="nova-manage" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.886870 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2d266c-cec5-4d51-afcc-d948d3cd7903" containerName="nova-manage" Dec 01 14:18:35 crc kubenswrapper[4585]: E1201 14:18:35.887013 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" containerName="nova-metadata-log" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.887099 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" containerName="nova-metadata-log" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.887806 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" 
containerName="nova-metadata-log" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.889281 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae2d266c-cec5-4d51-afcc-d948d3cd7903" containerName="nova-manage" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.889382 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" containerName="nova-metadata-metadata" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.891950 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.896497 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.896565 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 14:18:35 crc kubenswrapper[4585]: I1201 14:18:35.941188 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.051268 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61c99c9c-c763-4e3a-8e71-6d78e3779991-config-data\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.051334 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/61c99c9c-c763-4e3a-8e71-6d78e3779991-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.051417 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c99c9c-c763-4e3a-8e71-6d78e3779991-logs\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.051453 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68dhg\" (UniqueName: \"kubernetes.io/projected/61c99c9c-c763-4e3a-8e71-6d78e3779991-kube-api-access-68dhg\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.051518 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c99c9c-c763-4e3a-8e71-6d78e3779991-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.153093 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c99c9c-c763-4e3a-8e71-6d78e3779991-logs\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.153172 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68dhg\" (UniqueName: 
\"kubernetes.io/projected/61c99c9c-c763-4e3a-8e71-6d78e3779991-kube-api-access-68dhg\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.153258 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c99c9c-c763-4e3a-8e71-6d78e3779991-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.153307 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61c99c9c-c763-4e3a-8e71-6d78e3779991-config-data\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.153339 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/61c99c9c-c763-4e3a-8e71-6d78e3779991-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.154798 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61c99c9c-c763-4e3a-8e71-6d78e3779991-logs\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.159511 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c99c9c-c763-4e3a-8e71-6d78e3779991-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.160196 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/61c99c9c-c763-4e3a-8e71-6d78e3779991-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.160647 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61c99c9c-c763-4e3a-8e71-6d78e3779991-config-data\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.173115 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68dhg\" (UniqueName: \"kubernetes.io/projected/61c99c9c-c763-4e3a-8e71-6d78e3779991-kube-api-access-68dhg\") pod \"nova-metadata-0\" (UID: \"61c99c9c-c763-4e3a-8e71-6d78e3779991\") " pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.211941 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.227144 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.355953 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/991dd5fb-9091-44df-a27a-9f130b663a51-config-data\") pod \"991dd5fb-9091-44df-a27a-9f130b663a51\" (UID: \"991dd5fb-9091-44df-a27a-9f130b663a51\") " Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.356172 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmvd5\" (UniqueName: \"kubernetes.io/projected/991dd5fb-9091-44df-a27a-9f130b663a51-kube-api-access-vmvd5\") pod \"991dd5fb-9091-44df-a27a-9f130b663a51\" (UID: \"991dd5fb-9091-44df-a27a-9f130b663a51\") " Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.356242 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/991dd5fb-9091-44df-a27a-9f130b663a51-combined-ca-bundle\") pod \"991dd5fb-9091-44df-a27a-9f130b663a51\" (UID: \"991dd5fb-9091-44df-a27a-9f130b663a51\") " Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.360167 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991dd5fb-9091-44df-a27a-9f130b663a51-kube-api-access-vmvd5" (OuterVolumeSpecName: "kube-api-access-vmvd5") pod "991dd5fb-9091-44df-a27a-9f130b663a51" (UID: "991dd5fb-9091-44df-a27a-9f130b663a51"). InnerVolumeSpecName "kube-api-access-vmvd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.394717 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/991dd5fb-9091-44df-a27a-9f130b663a51-config-data" (OuterVolumeSpecName: "config-data") pod "991dd5fb-9091-44df-a27a-9f130b663a51" (UID: "991dd5fb-9091-44df-a27a-9f130b663a51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.434222 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/991dd5fb-9091-44df-a27a-9f130b663a51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "991dd5fb-9091-44df-a27a-9f130b663a51" (UID: "991dd5fb-9091-44df-a27a-9f130b663a51"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.449885 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9080235-3e98-4f0f-945c-475e6de22a49" path="/var/lib/kubelet/pods/d9080235-3e98-4f0f-945c-475e6de22a49/volumes" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.469692 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/991dd5fb-9091-44df-a27a-9f130b663a51-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.469721 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmvd5\" (UniqueName: \"kubernetes.io/projected/991dd5fb-9091-44df-a27a-9f130b663a51-kube-api-access-vmvd5\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.469729 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/991dd5fb-9091-44df-a27a-9f130b663a51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.596113 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.778369 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"991dd5fb-9091-44df-a27a-9f130b663a51","Type":"ContainerDied","Data":"0e4f554130f94fbfb3a664a495b63147e23c556ba2760b4e375c1eb98e15a5bb"} Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.778416 4585 scope.go:117] "RemoveContainer" containerID="da1082cd1c1f62526de89333e81b0e2a2c3f5a276706131e9ce35f4059abf6e0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.778498 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.783237 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61c99c9c-c763-4e3a-8e71-6d78e3779991","Type":"ContainerStarted","Data":"ad9a06ab93212a07de33f64409bbc4f8ebf19dea2f037179338e5704d21fb64d"} Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.814519 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.830199 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.852915 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:18:36 crc kubenswrapper[4585]: E1201 14:18:36.853328 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991dd5fb-9091-44df-a27a-9f130b663a51" containerName="nova-scheduler-scheduler" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.853347 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="991dd5fb-9091-44df-a27a-9f130b663a51" containerName="nova-scheduler-scheduler" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.853542 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="991dd5fb-9091-44df-a27a-9f130b663a51" containerName="nova-scheduler-scheduler" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.854151 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.871253 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.896321 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.988845 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779dee18-6c1e-4e00-b3be-22ce3d8e2259-config-data\") pod \"nova-scheduler-0\" (UID: \"779dee18-6c1e-4e00-b3be-22ce3d8e2259\") " pod="openstack/nova-scheduler-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.988955 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f84dn\" (UniqueName: \"kubernetes.io/projected/779dee18-6c1e-4e00-b3be-22ce3d8e2259-kube-api-access-f84dn\") pod \"nova-scheduler-0\" (UID: \"779dee18-6c1e-4e00-b3be-22ce3d8e2259\") " pod="openstack/nova-scheduler-0" Dec 01 14:18:36 crc kubenswrapper[4585]: I1201 14:18:36.989080 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779dee18-6c1e-4e00-b3be-22ce3d8e2259-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"779dee18-6c1e-4e00-b3be-22ce3d8e2259\") " pod="openstack/nova-scheduler-0" Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.091422 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f84dn\" (UniqueName: \"kubernetes.io/projected/779dee18-6c1e-4e00-b3be-22ce3d8e2259-kube-api-access-f84dn\") pod \"nova-scheduler-0\" (UID: \"779dee18-6c1e-4e00-b3be-22ce3d8e2259\") " pod="openstack/nova-scheduler-0" Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.091898 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779dee18-6c1e-4e00-b3be-22ce3d8e2259-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"779dee18-6c1e-4e00-b3be-22ce3d8e2259\") " pod="openstack/nova-scheduler-0" Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.091987 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779dee18-6c1e-4e00-b3be-22ce3d8e2259-config-data\") pod \"nova-scheduler-0\" (UID: \"779dee18-6c1e-4e00-b3be-22ce3d8e2259\") " pod="openstack/nova-scheduler-0" Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.096412 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779dee18-6c1e-4e00-b3be-22ce3d8e2259-config-data\") pod \"nova-scheduler-0\" (UID: \"779dee18-6c1e-4e00-b3be-22ce3d8e2259\") " pod="openstack/nova-scheduler-0" Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.098698 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779dee18-6c1e-4e00-b3be-22ce3d8e2259-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"779dee18-6c1e-4e00-b3be-22ce3d8e2259\") " pod="openstack/nova-scheduler-0" Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.117042 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f84dn\" (UniqueName: 
\"kubernetes.io/projected/779dee18-6c1e-4e00-b3be-22ce3d8e2259-kube-api-access-f84dn\") pod \"nova-scheduler-0\" (UID: \"779dee18-6c1e-4e00-b3be-22ce3d8e2259\") " pod="openstack/nova-scheduler-0" Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.273396 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.798132 4585 generic.go:334] "Generic (PLEG): container finished" podID="1f8c300c-5295-4071-a82d-f3e5884bb729" containerID="865eaef9d545ca0d34430e183770b2e4fdd6be28b7d1f37a030135df2976a52f" exitCode=0 Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.798472 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1f8c300c-5295-4071-a82d-f3e5884bb729","Type":"ContainerDied","Data":"865eaef9d545ca0d34430e183770b2e4fdd6be28b7d1f37a030135df2976a52f"} Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.800837 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61c99c9c-c763-4e3a-8e71-6d78e3779991","Type":"ContainerStarted","Data":"a78a5ff91325baecdc9ec5f30ad32a1ebccd13e782c2351781dd1c684107ef62"} Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.800874 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"61c99c9c-c763-4e3a-8e71-6d78e3779991","Type":"ContainerStarted","Data":"c2af6f3d4bf61237a8b98090fc0ccbca45ef5c9d5c9415fc7da054f365fadf00"} Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.826127 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.826110229 podStartE2EDuration="2.826110229s" podCreationTimestamp="2025-12-01 14:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:18:37.825707478 +0000 UTC m=+1231.809921333" watchObservedRunningTime="2025-12-01 14:18:37.826110229 +0000 UTC m=+1231.810324074" Dec 01 14:18:37 crc kubenswrapper[4585]: I1201 14:18:37.927410 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.095797 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.214767 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8c300c-5295-4071-a82d-f3e5884bb729-logs\") pod \"1f8c300c-5295-4071-a82d-f3e5884bb729\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.214853 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-combined-ca-bundle\") pod \"1f8c300c-5295-4071-a82d-f3e5884bb729\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.214882 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-public-tls-certs\") pod \"1f8c300c-5295-4071-a82d-f3e5884bb729\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.214899 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-internal-tls-certs\") pod \"1f8c300c-5295-4071-a82d-f3e5884bb729\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.215034 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-config-data\") pod \"1f8c300c-5295-4071-a82d-f3e5884bb729\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.215087 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf6jv\" (UniqueName: \"kubernetes.io/projected/1f8c300c-5295-4071-a82d-f3e5884bb729-kube-api-access-xf6jv\") pod \"1f8c300c-5295-4071-a82d-f3e5884bb729\" (UID: \"1f8c300c-5295-4071-a82d-f3e5884bb729\") " Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.216187 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f8c300c-5295-4071-a82d-f3e5884bb729-logs" (OuterVolumeSpecName: "logs") pod "1f8c300c-5295-4071-a82d-f3e5884bb729" (UID: "1f8c300c-5295-4071-a82d-f3e5884bb729"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.222082 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f8c300c-5295-4071-a82d-f3e5884bb729-kube-api-access-xf6jv" (OuterVolumeSpecName: "kube-api-access-xf6jv") pod "1f8c300c-5295-4071-a82d-f3e5884bb729" (UID: "1f8c300c-5295-4071-a82d-f3e5884bb729"). InnerVolumeSpecName "kube-api-access-xf6jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.256720 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f8c300c-5295-4071-a82d-f3e5884bb729" (UID: "1f8c300c-5295-4071-a82d-f3e5884bb729"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.268311 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-config-data" (OuterVolumeSpecName: "config-data") pod "1f8c300c-5295-4071-a82d-f3e5884bb729" (UID: "1f8c300c-5295-4071-a82d-f3e5884bb729"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.277991 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1f8c300c-5295-4071-a82d-f3e5884bb729" (UID: "1f8c300c-5295-4071-a82d-f3e5884bb729"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.280583 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1f8c300c-5295-4071-a82d-f3e5884bb729" (UID: "1f8c300c-5295-4071-a82d-f3e5884bb729"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.318117 4585 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f8c300c-5295-4071-a82d-f3e5884bb729-logs\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.318299 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.318366 4585 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.318425 4585 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.318475 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f8c300c-5295-4071-a82d-f3e5884bb729-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.318542 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf6jv\" (UniqueName: \"kubernetes.io/projected/1f8c300c-5295-4071-a82d-f3e5884bb729-kube-api-access-xf6jv\") on node \"crc\" DevicePath \"\"" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.428593 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="991dd5fb-9091-44df-a27a-9f130b663a51" path="/var/lib/kubelet/pods/991dd5fb-9091-44df-a27a-9f130b663a51/volumes" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.809935 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"779dee18-6c1e-4e00-b3be-22ce3d8e2259","Type":"ContainerStarted","Data":"6b614cff6a4936c5abd67a9e331a8328e2bcbd40b230a9a6af53e666cf9bc3c8"} Dec 01 14:18:38 crc 
kubenswrapper[4585]: I1201 14:18:38.810001 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"779dee18-6c1e-4e00-b3be-22ce3d8e2259","Type":"ContainerStarted","Data":"46b033750d4ed7129e2d7768d0434e9a2491bf346049ff59b5534cf5fcd2fb04"} Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.812149 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.812528 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1f8c300c-5295-4071-a82d-f3e5884bb729","Type":"ContainerDied","Data":"e8ca071b4524a386efc860e8ebb9508b944c4efcd707253009835b6ecbae7e81"} Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.812582 4585 scope.go:117] "RemoveContainer" containerID="865eaef9d545ca0d34430e183770b2e4fdd6be28b7d1f37a030135df2976a52f" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.835890 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.835873167 podStartE2EDuration="2.835873167s" podCreationTimestamp="2025-12-01 14:18:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:18:38.829286852 +0000 UTC m=+1232.813500707" watchObservedRunningTime="2025-12-01 14:18:38.835873167 +0000 UTC m=+1232.820087022" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.852322 4585 scope.go:117] "RemoveContainer" containerID="a1bc6c9c9a427bd35f544fddf7f399ddbcbb9f157c8d480e5433e1169a02e7bf" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.915937 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.927445 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.935897 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 14:18:38 crc kubenswrapper[4585]: E1201 14:18:38.936228 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8c300c-5295-4071-a82d-f3e5884bb729" containerName="nova-api-api" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.936246 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8c300c-5295-4071-a82d-f3e5884bb729" containerName="nova-api-api" Dec 01 14:18:38 crc kubenswrapper[4585]: E1201 14:18:38.936257 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f8c300c-5295-4071-a82d-f3e5884bb729" containerName="nova-api-log" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.936264 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f8c300c-5295-4071-a82d-f3e5884bb729" containerName="nova-api-log" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.936443 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8c300c-5295-4071-a82d-f3e5884bb729" containerName="nova-api-api" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.936464 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f8c300c-5295-4071-a82d-f3e5884bb729" containerName="nova-api-log" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.937368 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.939741 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.941742 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.941917 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 14:18:38 crc kubenswrapper[4585]: I1201 14:18:38.946505 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.036147 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-config-data\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.036196 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.036223 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jld4g\" (UniqueName: \"kubernetes.io/projected/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-kube-api-access-jld4g\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.036348 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-public-tls-certs\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.036378 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-logs\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.036443 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.137827 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.137874 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jld4g\" (UniqueName: \"kubernetes.io/projected/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-kube-api-access-jld4g\") 
pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.137950 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-public-tls-certs\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.137990 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-logs\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.138044 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.138075 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-config-data\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.138651 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-logs\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.141700 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-config-data\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.143383 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.143815 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.160649 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-public-tls-certs\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.168734 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jld4g\" (UniqueName: \"kubernetes.io/projected/f2bbc3d0-64bb-4942-8ee2-a05b538ec68f-kube-api-access-jld4g\") pod \"nova-api-0\" (UID: \"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f\") " pod="openstack/nova-api-0" Dec 
01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.253028 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.739809 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 14:18:39 crc kubenswrapper[4585]: W1201 14:18:39.747627 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2bbc3d0_64bb_4942_8ee2_a05b538ec68f.slice/crio-52782ff3db61f667f6fd61df67a82a6acb3ba99b6b71a117015e62e986808b0b WatchSource:0}: Error finding container 52782ff3db61f667f6fd61df67a82a6acb3ba99b6b71a117015e62e986808b0b: Status 404 returned error can't find the container with id 52782ff3db61f667f6fd61df67a82a6acb3ba99b6b71a117015e62e986808b0b Dec 01 14:18:39 crc kubenswrapper[4585]: I1201 14:18:39.825205 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f","Type":"ContainerStarted","Data":"52782ff3db61f667f6fd61df67a82a6acb3ba99b6b71a117015e62e986808b0b"} Dec 01 14:18:40 crc kubenswrapper[4585]: I1201 14:18:40.423545 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f8c300c-5295-4071-a82d-f3e5884bb729" path="/var/lib/kubelet/pods/1f8c300c-5295-4071-a82d-f3e5884bb729/volumes" Dec 01 14:18:40 crc kubenswrapper[4585]: I1201 14:18:40.835250 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f","Type":"ContainerStarted","Data":"54dcc1c2f505ded992edc488cc78892b03a6a631e7340dc8204a7c5e3e7a970a"} Dec 01 14:18:40 crc kubenswrapper[4585]: I1201 14:18:40.835311 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2bbc3d0-64bb-4942-8ee2-a05b538ec68f","Type":"ContainerStarted","Data":"8f5494685c2aef5ffa29df3b86031898c8d8bd8768942106583ee6c2921caff7"} Dec 01 14:18:40 crc kubenswrapper[4585]: I1201 14:18:40.860915 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.860893521 podStartE2EDuration="2.860893521s" podCreationTimestamp="2025-12-01 14:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:18:40.852388454 +0000 UTC m=+1234.836602319" watchObservedRunningTime="2025-12-01 14:18:40.860893521 +0000 UTC m=+1234.845107376" Dec 01 14:18:41 crc kubenswrapper[4585]: I1201 14:18:41.228602 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 14:18:41 crc kubenswrapper[4585]: I1201 14:18:41.228655 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 14:18:42 crc kubenswrapper[4585]: I1201 14:18:42.273534 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 14:18:43 crc kubenswrapper[4585]: I1201 14:18:43.715951 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:18:43 crc kubenswrapper[4585]: I1201 14:18:43.716247 4585 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:18:43 crc kubenswrapper[4585]: I1201 14:18:43.716294 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:18:43 crc kubenswrapper[4585]: I1201 14:18:43.717116 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad6574d507c0610da07eb42cf40383d7aa7800bda84bee35a347684dc954f810"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:18:43 crc kubenswrapper[4585]: I1201 14:18:43.717175 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://ad6574d507c0610da07eb42cf40383d7aa7800bda84bee35a347684dc954f810" gracePeriod=600 Dec 01 14:18:44 crc kubenswrapper[4585]: I1201 14:18:44.878489 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="ad6574d507c0610da07eb42cf40383d7aa7800bda84bee35a347684dc954f810" exitCode=0 Dec 01 14:18:44 crc kubenswrapper[4585]: I1201 14:18:44.878563 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"ad6574d507c0610da07eb42cf40383d7aa7800bda84bee35a347684dc954f810"} Dec 01 14:18:44 crc kubenswrapper[4585]: I1201 14:18:44.878825 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"1fbf41077863f3fc04eeb32135f7e1a50fc3bb2ac74df27d60186d6226d4dc1b"} Dec 01 14:18:44 crc kubenswrapper[4585]: I1201 14:18:44.878845 4585 scope.go:117] "RemoveContainer" containerID="041ce578b922e9949f0fa3c528cdc2179672e360c800f9ad0c54e96def5e8b8a" Dec 01 14:18:46 crc kubenswrapper[4585]: I1201 14:18:46.228614 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 14:18:46 crc kubenswrapper[4585]: I1201 14:18:46.229099 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 14:18:47 crc kubenswrapper[4585]: I1201 14:18:47.242180 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="61c99c9c-c763-4e3a-8e71-6d78e3779991" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 14:18:47 crc kubenswrapper[4585]: I1201 14:18:47.242227 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="61c99c9c-c763-4e3a-8e71-6d78e3779991" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 14:18:47 crc kubenswrapper[4585]: 
I1201 14:18:47.274162 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 14:18:47 crc kubenswrapper[4585]: I1201 14:18:47.308945 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 14:18:47 crc kubenswrapper[4585]: I1201 14:18:47.941961 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 14:18:49 crc kubenswrapper[4585]: I1201 14:18:49.254535 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 14:18:49 crc kubenswrapper[4585]: I1201 14:18:49.254848 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 14:18:50 crc kubenswrapper[4585]: I1201 14:18:50.268261 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f2bbc3d0-64bb-4942-8ee2-a05b538ec68f" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 14:18:50 crc kubenswrapper[4585]: I1201 14:18:50.268292 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f2bbc3d0-64bb-4942-8ee2-a05b538ec68f" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 14:18:56 crc kubenswrapper[4585]: I1201 14:18:56.235785 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 14:18:56 crc kubenswrapper[4585]: I1201 14:18:56.239577 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 14:18:56 crc kubenswrapper[4585]: I1201 14:18:56.242015 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 14:18:57 crc kubenswrapper[4585]: I1201 14:18:57.002555 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 14:18:58 crc kubenswrapper[4585]: I1201 14:18:58.159250 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 14:18:59 crc kubenswrapper[4585]: I1201 14:18:59.260303 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 14:18:59 crc kubenswrapper[4585]: I1201 14:18:59.260932 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 14:18:59 crc kubenswrapper[4585]: I1201 14:18:59.262568 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 14:18:59 crc kubenswrapper[4585]: I1201 14:18:59.268966 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 14:19:00 crc kubenswrapper[4585]: I1201 14:19:00.021105 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 14:19:00 crc kubenswrapper[4585]: I1201 14:19:00.027684 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 14:19:08 crc kubenswrapper[4585]: I1201 14:19:08.408584 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 14:19:09 crc kubenswrapper[4585]: I1201 
14:19:09.432850 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 14:19:13 crc kubenswrapper[4585]: I1201 14:19:13.056169 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d41c9a27-f15b-44c5-84b2-0e083f8dc837" containerName="rabbitmq" containerID="cri-o://85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9" gracePeriod=604796 Dec 01 14:19:14 crc kubenswrapper[4585]: I1201 14:19:14.001631 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="6c266121-e7d2-42aa-b1d9-0d15bdd0f798" containerName="rabbitmq" containerID="cri-o://750cf3415d2c964a31c9ffec8a7a334d06b9a8c65269a22a794fe0a3cf8946c2" gracePeriod=604796 Dec 01 14:19:15 crc kubenswrapper[4585]: I1201 14:19:15.120086 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="6c266121-e7d2-42aa-b1d9-0d15bdd0f798" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.94:5671: connect: connection refused" Dec 01 14:19:15 crc kubenswrapper[4585]: I1201 14:19:15.180617 4585 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d41c9a27-f15b-44c5-84b2-0e083f8dc837" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.643596 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.728677 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-server-conf\") pod \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.728738 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-confd\") pod \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.728792 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-config-data\") pod \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.728848 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-tls\") pod \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.728910 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwnv5\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-kube-api-access-gwnv5\") pod \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.728981 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d41c9a27-f15b-44c5-84b2-0e083f8dc837-erlang-cookie-secret\") pod \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.729001 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-plugins-conf\") pod \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.729025 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-plugins\") pod \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.729898 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d41c9a27-f15b-44c5-84b2-0e083f8dc837-pod-info\") pod \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.730018 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.730087 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-erlang-cookie\") pod \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\" (UID: \"d41c9a27-f15b-44c5-84b2-0e083f8dc837\") " Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.730154 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d41c9a27-f15b-44c5-84b2-0e083f8dc837" (UID: "d41c9a27-f15b-44c5-84b2-0e083f8dc837"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.733278 4585 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.734121 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d41c9a27-f15b-44c5-84b2-0e083f8dc837" (UID: "d41c9a27-f15b-44c5-84b2-0e083f8dc837"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.735296 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d41c9a27-f15b-44c5-84b2-0e083f8dc837" (UID: "d41c9a27-f15b-44c5-84b2-0e083f8dc837"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.753850 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d41c9a27-f15b-44c5-84b2-0e083f8dc837" (UID: "d41c9a27-f15b-44c5-84b2-0e083f8dc837"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.758090 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d41c9a27-f15b-44c5-84b2-0e083f8dc837-pod-info" (OuterVolumeSpecName: "pod-info") pod "d41c9a27-f15b-44c5-84b2-0e083f8dc837" (UID: "d41c9a27-f15b-44c5-84b2-0e083f8dc837"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.763514 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41c9a27-f15b-44c5-84b2-0e083f8dc837-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d41c9a27-f15b-44c5-84b2-0e083f8dc837" (UID: "d41c9a27-f15b-44c5-84b2-0e083f8dc837"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.765723 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-kube-api-access-gwnv5" (OuterVolumeSpecName: "kube-api-access-gwnv5") pod "d41c9a27-f15b-44c5-84b2-0e083f8dc837" (UID: "d41c9a27-f15b-44c5-84b2-0e083f8dc837"). InnerVolumeSpecName "kube-api-access-gwnv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.802920 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "d41c9a27-f15b-44c5-84b2-0e083f8dc837" (UID: "d41c9a27-f15b-44c5-84b2-0e083f8dc837"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.842880 4585 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.842917 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwnv5\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-kube-api-access-gwnv5\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.842932 4585 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d41c9a27-f15b-44c5-84b2-0e083f8dc837-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.842942 4585 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.842951 4585 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d41c9a27-f15b-44c5-84b2-0e083f8dc837-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.842994 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.843006 4585 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.861404 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-server-conf" (OuterVolumeSpecName: "server-conf") pod "d41c9a27-f15b-44c5-84b2-0e083f8dc837" (UID: "d41c9a27-f15b-44c5-84b2-0e083f8dc837"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.872486 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-config-data" (OuterVolumeSpecName: "config-data") pod "d41c9a27-f15b-44c5-84b2-0e083f8dc837" (UID: "d41c9a27-f15b-44c5-84b2-0e083f8dc837"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.895175 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.945122 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.945159 4585 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.945168 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d41c9a27-f15b-44c5-84b2-0e083f8dc837-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:19 crc kubenswrapper[4585]: I1201 14:19:19.952384 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d41c9a27-f15b-44c5-84b2-0e083f8dc837" (UID: "d41c9a27-f15b-44c5-84b2-0e083f8dc837"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.047294 4585 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d41c9a27-f15b-44c5-84b2-0e083f8dc837-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.234090 4585 generic.go:334] "Generic (PLEG): container finished" podID="6c266121-e7d2-42aa-b1d9-0d15bdd0f798" containerID="750cf3415d2c964a31c9ffec8a7a334d06b9a8c65269a22a794fe0a3cf8946c2" exitCode=0 Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.234153 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c266121-e7d2-42aa-b1d9-0d15bdd0f798","Type":"ContainerDied","Data":"750cf3415d2c964a31c9ffec8a7a334d06b9a8c65269a22a794fe0a3cf8946c2"} Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.242195 4585 generic.go:334] "Generic (PLEG): container finished" podID="d41c9a27-f15b-44c5-84b2-0e083f8dc837" containerID="85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9" exitCode=0 Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.242239 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d41c9a27-f15b-44c5-84b2-0e083f8dc837","Type":"ContainerDied","Data":"85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9"} Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.242266 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d41c9a27-f15b-44c5-84b2-0e083f8dc837","Type":"ContainerDied","Data":"16f7c8940c8b332113a9028b5b8993a30fe696de6b22be734af4a73e28b2daf2"} Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.242282 4585 scope.go:117] "RemoveContainer" containerID="85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.242429 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.297453 4585 scope.go:117] "RemoveContainer" containerID="136bde2b6e8e606ac594a703cd22914247375724ce60c2580695ad1f22d011e9" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.310763 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.328328 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.334289 4585 scope.go:117] "RemoveContainer" containerID="85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9" Dec 01 14:19:20 crc kubenswrapper[4585]: E1201 14:19:20.339160 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9\": container with ID starting with 85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9 not found: ID does not exist" containerID="85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.339237 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9"} err="failed to get container status \"85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9\": rpc error: code = NotFound desc = could not find container \"85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9\": container with ID starting with 85d3f6b4909b70bf35ccaee9949916c82b673ab0fc7b98cbea1b5d6b216206b9 not found: ID does not exist" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.339269 4585 scope.go:117] "RemoveContainer" containerID="136bde2b6e8e606ac594a703cd22914247375724ce60c2580695ad1f22d011e9" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.342778 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 14:19:20 crc kubenswrapper[4585]: E1201 14:19:20.343247 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41c9a27-f15b-44c5-84b2-0e083f8dc837" containerName="setup-container" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.343262 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41c9a27-f15b-44c5-84b2-0e083f8dc837" containerName="setup-container" Dec 01 14:19:20 crc kubenswrapper[4585]: E1201 14:19:20.343297 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41c9a27-f15b-44c5-84b2-0e083f8dc837" containerName="rabbitmq" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.343303 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41c9a27-f15b-44c5-84b2-0e083f8dc837" containerName="rabbitmq" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.343474 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d41c9a27-f15b-44c5-84b2-0e083f8dc837" containerName="rabbitmq" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.344556 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.355227 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.360623 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.360826 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.363742 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.364565 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.366379 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.366771 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.371650 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-hjxqc" Dec 01 14:19:20 crc kubenswrapper[4585]: E1201 14:19:20.375563 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136bde2b6e8e606ac594a703cd22914247375724ce60c2580695ad1f22d011e9\": container with ID starting with 136bde2b6e8e606ac594a703cd22914247375724ce60c2580695ad1f22d011e9 not found: ID does not exist" containerID="136bde2b6e8e606ac594a703cd22914247375724ce60c2580695ad1f22d011e9" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.375601 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136bde2b6e8e606ac594a703cd22914247375724ce60c2580695ad1f22d011e9"} err="failed to get container status \"136bde2b6e8e606ac594a703cd22914247375724ce60c2580695ad1f22d011e9\": rpc error: code = NotFound desc = could not find container \"136bde2b6e8e606ac594a703cd22914247375724ce60c2580695ad1f22d011e9\": container with ID starting with 136bde2b6e8e606ac594a703cd22914247375724ce60c2580695ad1f22d011e9 not found: ID does not exist" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.443240 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41c9a27-f15b-44c5-84b2-0e083f8dc837" path="/var/lib/kubelet/pods/d41c9a27-f15b-44c5-84b2-0e083f8dc837/volumes" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.458938 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b271e13c-b935-4f31-a32d-865af7228e55-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.458992 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.459038 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b271e13c-b935-4f31-a32d-865af7228e55-config-data\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.459093 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b271e13c-b935-4f31-a32d-865af7228e55-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.459119 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b271e13c-b935-4f31-a32d-865af7228e55-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.459133 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b271e13c-b935-4f31-a32d-865af7228e55-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.459150 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b271e13c-b935-4f31-a32d-865af7228e55-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.459174 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp96q\" (UniqueName: \"kubernetes.io/projected/b271e13c-b935-4f31-a32d-865af7228e55-kube-api-access-tp96q\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.459193 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b271e13c-b935-4f31-a32d-865af7228e55-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.459233 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b271e13c-b935-4f31-a32d-865af7228e55-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.459264 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b271e13c-b935-4f31-a32d-865af7228e55-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562037 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/b271e13c-b935-4f31-a32d-865af7228e55-config-data\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562129 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b271e13c-b935-4f31-a32d-865af7228e55-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562168 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b271e13c-b935-4f31-a32d-865af7228e55-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562191 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b271e13c-b935-4f31-a32d-865af7228e55-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562210 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b271e13c-b935-4f31-a32d-865af7228e55-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562233 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp96q\" (UniqueName: \"kubernetes.io/projected/b271e13c-b935-4f31-a32d-865af7228e55-kube-api-access-tp96q\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562251 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b271e13c-b935-4f31-a32d-865af7228e55-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562301 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b271e13c-b935-4f31-a32d-865af7228e55-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562332 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b271e13c-b935-4f31-a32d-865af7228e55-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562383 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b271e13c-b935-4f31-a32d-865af7228e55-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 
14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562695 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562956 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b271e13c-b935-4f31-a32d-865af7228e55-config-data\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.562985 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.563657 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b271e13c-b935-4f31-a32d-865af7228e55-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.564297 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b271e13c-b935-4f31-a32d-865af7228e55-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.564502 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b271e13c-b935-4f31-a32d-865af7228e55-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.564513 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b271e13c-b935-4f31-a32d-865af7228e55-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.570828 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b271e13c-b935-4f31-a32d-865af7228e55-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.571381 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b271e13c-b935-4f31-a32d-865af7228e55-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.573655 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b271e13c-b935-4f31-a32d-865af7228e55-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.579959 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b271e13c-b935-4f31-a32d-865af7228e55-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.580437 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp96q\" (UniqueName: \"kubernetes.io/projected/b271e13c-b935-4f31-a32d-865af7228e55-kube-api-access-tp96q\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.636023 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"b271e13c-b935-4f31-a32d-865af7228e55\") " pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.700333 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.701073 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.765619 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-erlang-cookie-secret\") pod \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.765668 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-plugins-conf\") pod \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.765756 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-server-conf\") pod \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.765814 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-pod-info\") pod \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.765829 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-config-data\") pod \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.765854 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-confd\") pod 
\"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.765877 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-tls\") pod \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.765910 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.765952 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-plugins\") pod \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.766689 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtdpp\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-kube-api-access-jtdpp\") pod \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.766760 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-erlang-cookie\") pod \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\" (UID: \"6c266121-e7d2-42aa-b1d9-0d15bdd0f798\") " Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.767903 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6c266121-e7d2-42aa-b1d9-0d15bdd0f798" (UID: "6c266121-e7d2-42aa-b1d9-0d15bdd0f798"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.769419 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6c266121-e7d2-42aa-b1d9-0d15bdd0f798" (UID: "6c266121-e7d2-42aa-b1d9-0d15bdd0f798"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.770052 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6c266121-e7d2-42aa-b1d9-0d15bdd0f798" (UID: "6c266121-e7d2-42aa-b1d9-0d15bdd0f798"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.793437 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6c266121-e7d2-42aa-b1d9-0d15bdd0f798" (UID: "6c266121-e7d2-42aa-b1d9-0d15bdd0f798"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.793883 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-pod-info" (OuterVolumeSpecName: "pod-info") pod "6c266121-e7d2-42aa-b1d9-0d15bdd0f798" (UID: "6c266121-e7d2-42aa-b1d9-0d15bdd0f798"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.794031 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6c266121-e7d2-42aa-b1d9-0d15bdd0f798" (UID: "6c266121-e7d2-42aa-b1d9-0d15bdd0f798"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.794915 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "6c266121-e7d2-42aa-b1d9-0d15bdd0f798" (UID: "6c266121-e7d2-42aa-b1d9-0d15bdd0f798"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.795108 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-kube-api-access-jtdpp" (OuterVolumeSpecName: "kube-api-access-jtdpp") pod "6c266121-e7d2-42aa-b1d9-0d15bdd0f798" (UID: "6c266121-e7d2-42aa-b1d9-0d15bdd0f798"). InnerVolumeSpecName "kube-api-access-jtdpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.846714 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-config-data" (OuterVolumeSpecName: "config-data") pod "6c266121-e7d2-42aa-b1d9-0d15bdd0f798" (UID: "6c266121-e7d2-42aa-b1d9-0d15bdd0f798"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.868315 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-server-conf" (OuterVolumeSpecName: "server-conf") pod "6c266121-e7d2-42aa-b1d9-0d15bdd0f798" (UID: "6c266121-e7d2-42aa-b1d9-0d15bdd0f798"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.869562 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtdpp\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-kube-api-access-jtdpp\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.869582 4585 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.869591 4585 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.869600 4585 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.869608 4585 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.869618 4585 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.869625 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.869633 4585 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.869657 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.869672 4585 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.911105 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.957077 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6c266121-e7d2-42aa-b1d9-0d15bdd0f798" (UID: "6c266121-e7d2-42aa-b1d9-0d15bdd0f798"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.971846 4585 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c266121-e7d2-42aa-b1d9-0d15bdd0f798-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:20 crc kubenswrapper[4585]: I1201 14:19:20.971876 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.253496 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c266121-e7d2-42aa-b1d9-0d15bdd0f798","Type":"ContainerDied","Data":"01129af23bcc210a5877ac141d365da74bc31f28218a2968425fc8e15107979c"} Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.253563 4585 scope.go:117] "RemoveContainer" containerID="750cf3415d2c964a31c9ffec8a7a334d06b9a8c65269a22a794fe0a3cf8946c2" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.253684 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.278351 4585 scope.go:117] "RemoveContainer" containerID="01471bb78ac8279a59d5f59a8cd08029ea754a6ffbcf48823c084903b339191c" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.308257 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.327054 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.347728 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.374067 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 14:19:21 crc kubenswrapper[4585]: E1201 14:19:21.374730 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c266121-e7d2-42aa-b1d9-0d15bdd0f798" containerName="setup-container" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.374810 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c266121-e7d2-42aa-b1d9-0d15bdd0f798" containerName="setup-container" Dec 01 14:19:21 crc kubenswrapper[4585]: E1201 14:19:21.374893 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c266121-e7d2-42aa-b1d9-0d15bdd0f798" containerName="rabbitmq" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.374961 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c266121-e7d2-42aa-b1d9-0d15bdd0f798" containerName="rabbitmq" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.375261 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c266121-e7d2-42aa-b1d9-0d15bdd0f798" containerName="rabbitmq" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.376432 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.385479 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.405644 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.406208 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.406272 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.406314 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.406367 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7czjg" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.406407 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.406458 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.504951 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/645c2200-d127-4ffe-a91e-9f9ae104dc06-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.505051 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/645c2200-d127-4ffe-a91e-9f9ae104dc06-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.505121 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/645c2200-d127-4ffe-a91e-9f9ae104dc06-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.505206 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.505335 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbs9f\" (UniqueName: \"kubernetes.io/projected/645c2200-d127-4ffe-a91e-9f9ae104dc06-kube-api-access-mbs9f\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.505376 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/645c2200-d127-4ffe-a91e-9f9ae104dc06-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.505476 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/645c2200-d127-4ffe-a91e-9f9ae104dc06-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.505521 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/645c2200-d127-4ffe-a91e-9f9ae104dc06-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.505611 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/645c2200-d127-4ffe-a91e-9f9ae104dc06-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.505649 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/645c2200-d127-4ffe-a91e-9f9ae104dc06-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.505696 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/645c2200-d127-4ffe-a91e-9f9ae104dc06-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.608185 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/645c2200-d127-4ffe-a91e-9f9ae104dc06-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.608276 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/645c2200-d127-4ffe-a91e-9f9ae104dc06-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.608347 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/645c2200-d127-4ffe-a91e-9f9ae104dc06-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.608455 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/645c2200-d127-4ffe-a91e-9f9ae104dc06-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.609888 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/645c2200-d127-4ffe-a91e-9f9ae104dc06-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.609957 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/645c2200-d127-4ffe-a91e-9f9ae104dc06-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.608702 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/645c2200-d127-4ffe-a91e-9f9ae104dc06-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.610078 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/645c2200-d127-4ffe-a91e-9f9ae104dc06-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.610111 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.610174 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbs9f\" (UniqueName: \"kubernetes.io/projected/645c2200-d127-4ffe-a91e-9f9ae104dc06-kube-api-access-mbs9f\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.610206 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/645c2200-d127-4ffe-a91e-9f9ae104dc06-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.610259 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/645c2200-d127-4ffe-a91e-9f9ae104dc06-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.610299 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/645c2200-d127-4ffe-a91e-9f9ae104dc06-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.610546 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/645c2200-d127-4ffe-a91e-9f9ae104dc06-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.610642 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/645c2200-d127-4ffe-a91e-9f9ae104dc06-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.610879 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.611276 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/645c2200-d127-4ffe-a91e-9f9ae104dc06-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.614686 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/645c2200-d127-4ffe-a91e-9f9ae104dc06-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.628240 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/645c2200-d127-4ffe-a91e-9f9ae104dc06-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.628537 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/645c2200-d127-4ffe-a91e-9f9ae104dc06-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.628616 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/645c2200-d127-4ffe-a91e-9f9ae104dc06-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.633153 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbs9f\" (UniqueName: \"kubernetes.io/projected/645c2200-d127-4ffe-a91e-9f9ae104dc06-kube-api-access-mbs9f\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.664299 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"645c2200-d127-4ffe-a91e-9f9ae104dc06\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:21 crc kubenswrapper[4585]: I1201 14:19:21.749772 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.221002 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.241118 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-b5dwd"] Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.242601 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.248039 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.280430 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-b5dwd"] Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.325282 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.325375 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6mx7\" (UniqueName: \"kubernetes.io/projected/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-kube-api-access-p6mx7\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.325412 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.326295 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.326371 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.326453 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
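
The local-storage04-crc entries spread across this window show a local persistent volume changing hands during that recreation: UnmountDevice succeeded for the old pod at 14:19:20.911105, then MountVolume.MountDevice succeeded for the new pod at 14:19:21.610879 with device mount path /mnt/openstack/pv04, followed by MountVolume.SetUp succeeded at 14:19:21.664299. A small illustrative timeline of the node-level device operations per volume can be pulled out as follows (same assumed kubelet.log; the regexes rely only on the message text shown here).

    import re
    from collections import defaultdict

    # klog header, e.g. "I1201 14:19:20.911105": level plus date, then time of day.
    KLOG_TS = re.compile(r'[IWE](\d{2})(\d{2}) (\d{2}:\d{2}:\d{2}\.\d+)')
    DEVICE_EVENT = re.compile(
        r'(MountVolume\.MountDevice succeeded|UnmountDevice succeeded)'
        r' for volume \\?"([^"\\]+)\\?"')

    def device_timeline(path="kubelet.log"):  # assumed path to the saved journal
        events = defaultdict(list)            # volume -> [(timestamp, event kind)]
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                ev, ts = DEVICE_EVENT.search(line), KLOG_TS.search(line)
                if ev and ts:
                    stamp = f"{ts.group(1)}-{ts.group(2)} {ts.group(3)}"
                    events[ev.group(2)].append((stamp, ev.group(1)))
        for volume, entries in sorted(events.items()):
            print(volume)
            for stamp, kind in sorted(entries):   # zero-padded, so string sort works
                print(f"  {stamp}  {kind}")

    if __name__ == "__main__":
        device_timeline()

For local-storage04-crc the unmount sorts before the mount, confirming the device was released by the old pod before being attached to its replacement.
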
\"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-config\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.326518 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.334393 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"645c2200-d127-4ffe-a91e-9f9ae104dc06","Type":"ContainerStarted","Data":"81cb2b0dc3d4b2823b98f90b13d7f8b9bbd2be7b732299c9d0d72f25829c5d10"} Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.335743 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b271e13c-b935-4f31-a32d-865af7228e55","Type":"ContainerStarted","Data":"2466b702df45aec784446a20970afd18cd9ca3a3d91d0f128f00ae21a7eee727"} Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.423359 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c266121-e7d2-42aa-b1d9-0d15bdd0f798" path="/var/lib/kubelet/pods/6c266121-e7d2-42aa-b1d9-0d15bdd0f798/volumes" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.428599 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.428679 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6mx7\" (UniqueName: \"kubernetes.io/projected/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-kube-api-access-p6mx7\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.428720 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.428748 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.428777 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.428822 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-config\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.428867 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.429587 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.429616 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.430218 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.430771 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-config\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.430860 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.431407 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.446799 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6mx7\" (UniqueName: \"kubernetes.io/projected/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-kube-api-access-p6mx7\") pod \"dnsmasq-dns-79bd4cc8c9-b5dwd\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:22 crc kubenswrapper[4585]: I1201 14:19:22.603217 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:23 crc kubenswrapper[4585]: I1201 14:19:23.095849 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-b5dwd"] Dec 01 14:19:23 crc kubenswrapper[4585]: W1201 14:19:23.099577 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2286dd35_c7a4_4b2b_bb9d_93a07c5bce8d.slice/crio-5a79e7743108e9221ffd4c542cb52affd8664b9afadbd01bf349640bf9469372 WatchSource:0}: Error finding container 5a79e7743108e9221ffd4c542cb52affd8664b9afadbd01bf349640bf9469372: Status 404 returned error can't find the container with id 5a79e7743108e9221ffd4c542cb52affd8664b9afadbd01bf349640bf9469372 Dec 01 14:19:23 crc kubenswrapper[4585]: I1201 14:19:23.351341 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" event={"ID":"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d","Type":"ContainerStarted","Data":"d3f4fc0b4d62ae2e3ee458495a356ef7790773a219fb2653936de6d93386c676"} Dec 01 14:19:23 crc kubenswrapper[4585]: I1201 14:19:23.351637 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" event={"ID":"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d","Type":"ContainerStarted","Data":"5a79e7743108e9221ffd4c542cb52affd8664b9afadbd01bf349640bf9469372"} Dec 01 14:19:23 crc kubenswrapper[4585]: I1201 14:19:23.355253 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b271e13c-b935-4f31-a32d-865af7228e55","Type":"ContainerStarted","Data":"a31b711a5b8afd65ad9f661a71bf84c9b1899dd96fd2798d26e8657ec6bb7581"} Dec 01 14:19:24 crc kubenswrapper[4585]: I1201 14:19:24.366249 4585 generic.go:334] "Generic (PLEG): container finished" podID="2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" containerID="d3f4fc0b4d62ae2e3ee458495a356ef7790773a219fb2653936de6d93386c676" exitCode=0 Dec 01 14:19:24 crc kubenswrapper[4585]: I1201 14:19:24.366346 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" event={"ID":"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d","Type":"ContainerDied","Data":"d3f4fc0b4d62ae2e3ee458495a356ef7790773a219fb2653936de6d93386c676"} Dec 01 14:19:24 crc kubenswrapper[4585]: I1201 14:19:24.368936 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"645c2200-d127-4ffe-a91e-9f9ae104dc06","Type":"ContainerStarted","Data":"1c89711d913f5eabaa4681e8495f26315d6ccd2dda9d0227e534402e665f7cdb"} Dec 01 14:19:25 crc kubenswrapper[4585]: I1201 14:19:25.381326 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" event={"ID":"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d","Type":"ContainerStarted","Data":"485861e66e7a293b6302b3d5ec4b307dd091303a33233d520e0dd9e36af99fb2"} Dec 01 14:19:25 crc kubenswrapper[4585]: I1201 14:19:25.382466 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:25 crc kubenswrapper[4585]: I1201 14:19:25.413877 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" podStartSLOduration=3.413847913 podStartE2EDuration="3.413847913s" podCreationTimestamp="2025-12-01 14:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:19:25.405181862 +0000 UTC 
m=+1279.389395727" watchObservedRunningTime="2025-12-01 14:19:25.413847913 +0000 UTC m=+1279.398061778" Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.605259 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.687023 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-lxk24"] Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.687276 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" podUID="d3580eb6-20f0-4ed5-a45b-6b081edd487d" containerName="dnsmasq-dns" containerID="cri-o://f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9" gracePeriod=10 Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.857197 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cd9bffc9-cjbbt"] Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.859778 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.935495 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd9bffc9-cjbbt"] Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.990251 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqftx\" (UniqueName: \"kubernetes.io/projected/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-kube-api-access-hqftx\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.990294 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-dns-swift-storage-0\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.990324 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-openstack-edpm-ipam\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.990358 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-dns-svc\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.990383 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-config\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.990405 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:32 crc kubenswrapper[4585]: I1201 14:19:32.990479 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.096506 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.096645 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqftx\" (UniqueName: \"kubernetes.io/projected/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-kube-api-access-hqftx\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.096678 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-dns-swift-storage-0\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.096721 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-openstack-edpm-ipam\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.096761 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-dns-svc\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.096796 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-config\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.096825 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.097629 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.098125 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-dns-svc\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.098657 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-dns-swift-storage-0\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.098845 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-openstack-edpm-ipam\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.098907 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-config\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.100386 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.137503 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqftx\" (UniqueName: \"kubernetes.io/projected/7e89a01c-fb21-4027-bbeb-6bfe70da33d0-kube-api-access-hqftx\") pod \"dnsmasq-dns-6cd9bffc9-cjbbt\" (UID: \"7e89a01c-fb21-4027-bbeb-6bfe70da33d0\") " pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.193168 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.316041 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.482317 4585 generic.go:334] "Generic (PLEG): container finished" podID="d3580eb6-20f0-4ed5-a45b-6b081edd487d" containerID="f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9" exitCode=0 Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.482359 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" event={"ID":"d3580eb6-20f0-4ed5-a45b-6b081edd487d","Type":"ContainerDied","Data":"f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9"} Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.482390 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" event={"ID":"d3580eb6-20f0-4ed5-a45b-6b081edd487d","Type":"ContainerDied","Data":"62c5562efca0f4a33895898c2e5e2b7ddc4301d722153d06ddd94c5e42bbbcfe"} Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.482391 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-lxk24" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.482407 4585 scope.go:117] "RemoveContainer" containerID="f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.505102 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-ovsdbserver-sb\") pod \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.505236 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-dns-svc\") pod \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.505359 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-ovsdbserver-nb\") pod \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.505404 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-dns-swift-storage-0\") pod \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.505434 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkwmf\" (UniqueName: \"kubernetes.io/projected/d3580eb6-20f0-4ed5-a45b-6b081edd487d-kube-api-access-wkwmf\") pod \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.506262 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-config\") pod \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\" (UID: \"d3580eb6-20f0-4ed5-a45b-6b081edd487d\") " Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.510739 4585 scope.go:117] "RemoveContainer" 
containerID="9ed97749f4ec64edb57f645adc69e887c4cd1acd81ecf8eb255c340f7ea63f5b" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.516938 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3580eb6-20f0-4ed5-a45b-6b081edd487d-kube-api-access-wkwmf" (OuterVolumeSpecName: "kube-api-access-wkwmf") pod "d3580eb6-20f0-4ed5-a45b-6b081edd487d" (UID: "d3580eb6-20f0-4ed5-a45b-6b081edd487d"). InnerVolumeSpecName "kube-api-access-wkwmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.549723 4585 scope.go:117] "RemoveContainer" containerID="f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9" Dec 01 14:19:33 crc kubenswrapper[4585]: E1201 14:19:33.550325 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9\": container with ID starting with f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9 not found: ID does not exist" containerID="f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.550359 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9"} err="failed to get container status \"f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9\": rpc error: code = NotFound desc = could not find container \"f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9\": container with ID starting with f4981f468997dae832f470c09a3c8b928be8b9dca1ef159716ef82c1c38d2ff9 not found: ID does not exist" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.550379 4585 scope.go:117] "RemoveContainer" containerID="9ed97749f4ec64edb57f645adc69e887c4cd1acd81ecf8eb255c340f7ea63f5b" Dec 01 14:19:33 crc kubenswrapper[4585]: E1201 14:19:33.550659 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed97749f4ec64edb57f645adc69e887c4cd1acd81ecf8eb255c340f7ea63f5b\": container with ID starting with 9ed97749f4ec64edb57f645adc69e887c4cd1acd81ecf8eb255c340f7ea63f5b not found: ID does not exist" containerID="9ed97749f4ec64edb57f645adc69e887c4cd1acd81ecf8eb255c340f7ea63f5b" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.550678 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed97749f4ec64edb57f645adc69e887c4cd1acd81ecf8eb255c340f7ea63f5b"} err="failed to get container status \"9ed97749f4ec64edb57f645adc69e887c4cd1acd81ecf8eb255c340f7ea63f5b\": rpc error: code = NotFound desc = could not find container \"9ed97749f4ec64edb57f645adc69e887c4cd1acd81ecf8eb255c340f7ea63f5b\": container with ID starting with 9ed97749f4ec64edb57f645adc69e887c4cd1acd81ecf8eb255c340f7ea63f5b not found: ID does not exist" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.573150 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d3580eb6-20f0-4ed5-a45b-6b081edd487d" (UID: "d3580eb6-20f0-4ed5-a45b-6b081edd487d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.573672 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d3580eb6-20f0-4ed5-a45b-6b081edd487d" (UID: "d3580eb6-20f0-4ed5-a45b-6b081edd487d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.579083 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d3580eb6-20f0-4ed5-a45b-6b081edd487d" (UID: "d3580eb6-20f0-4ed5-a45b-6b081edd487d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.580215 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-config" (OuterVolumeSpecName: "config") pod "d3580eb6-20f0-4ed5-a45b-6b081edd487d" (UID: "d3580eb6-20f0-4ed5-a45b-6b081edd487d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.587112 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d3580eb6-20f0-4ed5-a45b-6b081edd487d" (UID: "d3580eb6-20f0-4ed5-a45b-6b081edd487d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.609392 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.609426 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.609439 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.609448 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.609457 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkwmf\" (UniqueName: \"kubernetes.io/projected/d3580eb6-20f0-4ed5-a45b-6b081edd487d-kube-api-access-wkwmf\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.609469 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3580eb6-20f0-4ed5-a45b-6b081edd487d-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.664233 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-6cd9bffc9-cjbbt"] Dec 01 14:19:33 crc kubenswrapper[4585]: W1201 14:19:33.664391 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e89a01c_fb21_4027_bbeb_6bfe70da33d0.slice/crio-ae9dc3b2318e7b44bd4aeb9176997dc5b041d99c0171ac71eaca179523a555fa WatchSource:0}: Error finding container ae9dc3b2318e7b44bd4aeb9176997dc5b041d99c0171ac71eaca179523a555fa: Status 404 returned error can't find the container with id ae9dc3b2318e7b44bd4aeb9176997dc5b041d99c0171ac71eaca179523a555fa Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.909931 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-lxk24"] Dec 01 14:19:33 crc kubenswrapper[4585]: I1201 14:19:33.923239 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-lxk24"] Dec 01 14:19:34 crc kubenswrapper[4585]: I1201 14:19:34.426426 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3580eb6-20f0-4ed5-a45b-6b081edd487d" path="/var/lib/kubelet/pods/d3580eb6-20f0-4ed5-a45b-6b081edd487d/volumes" Dec 01 14:19:34 crc kubenswrapper[4585]: I1201 14:19:34.495127 4585 generic.go:334] "Generic (PLEG): container finished" podID="7e89a01c-fb21-4027-bbeb-6bfe70da33d0" containerID="0139a40c4203a6fa9c1b9cff756611c7f1fdec272c26562f5ff3269c180b3938" exitCode=0 Dec 01 14:19:34 crc kubenswrapper[4585]: I1201 14:19:34.495211 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" event={"ID":"7e89a01c-fb21-4027-bbeb-6bfe70da33d0","Type":"ContainerDied","Data":"0139a40c4203a6fa9c1b9cff756611c7f1fdec272c26562f5ff3269c180b3938"} Dec 01 14:19:34 crc kubenswrapper[4585]: I1201 14:19:34.495278 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" event={"ID":"7e89a01c-fb21-4027-bbeb-6bfe70da33d0","Type":"ContainerStarted","Data":"ae9dc3b2318e7b44bd4aeb9176997dc5b041d99c0171ac71eaca179523a555fa"} Dec 01 14:19:35 crc kubenswrapper[4585]: I1201 14:19:35.506343 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" event={"ID":"7e89a01c-fb21-4027-bbeb-6bfe70da33d0","Type":"ContainerStarted","Data":"afbe6ed49d4411612df18da309579bd73a397db3d16885da20d82f57d9670c3a"} Dec 01 14:19:35 crc kubenswrapper[4585]: I1201 14:19:35.506639 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:35 crc kubenswrapper[4585]: I1201 14:19:35.533567 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" podStartSLOduration=3.533545462 podStartE2EDuration="3.533545462s" podCreationTimestamp="2025-12-01 14:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:19:35.524064509 +0000 UTC m=+1289.508278364" watchObservedRunningTime="2025-12-01 14:19:35.533545462 +0000 UTC m=+1289.517759317" Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.196110 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cd9bffc9-cjbbt" Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.285780 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-b5dwd"] Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.286590 4585 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" podUID="2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" containerName="dnsmasq-dns" containerID="cri-o://485861e66e7a293b6302b3d5ec4b307dd091303a33233d520e0dd9e36af99fb2" gracePeriod=10 Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.616423 4585 generic.go:334] "Generic (PLEG): container finished" podID="2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" containerID="485861e66e7a293b6302b3d5ec4b307dd091303a33233d520e0dd9e36af99fb2" exitCode=0 Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.616486 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" event={"ID":"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d","Type":"ContainerDied","Data":"485861e66e7a293b6302b3d5ec4b307dd091303a33233d520e0dd9e36af99fb2"} Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.777785 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.946386 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-config\") pod \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.946476 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6mx7\" (UniqueName: \"kubernetes.io/projected/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-kube-api-access-p6mx7\") pod \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.946515 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-ovsdbserver-sb\") pod \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.946579 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-openstack-edpm-ipam\") pod \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.946635 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-dns-swift-storage-0\") pod \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.946725 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-ovsdbserver-nb\") pod \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 14:19:43.946767 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-dns-svc\") pod \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\" (UID: \"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d\") " Dec 01 14:19:43 crc kubenswrapper[4585]: I1201 
14:19:43.971049 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-kube-api-access-p6mx7" (OuterVolumeSpecName: "kube-api-access-p6mx7") pod "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" (UID: "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d"). InnerVolumeSpecName "kube-api-access-p6mx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.013902 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" (UID: "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.014133 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-config" (OuterVolumeSpecName: "config") pod "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" (UID: "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.016037 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" (UID: "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.016187 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" (UID: "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.025078 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" (UID: "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.026626 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" (UID: "2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.049721 4585 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.049763 4585 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.049777 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6mx7\" (UniqueName: \"kubernetes.io/projected/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-kube-api-access-p6mx7\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.049789 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.049801 4585 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.049814 4585 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.049826 4585 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.630027 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" event={"ID":"2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d","Type":"ContainerDied","Data":"5a79e7743108e9221ffd4c542cb52affd8664b9afadbd01bf349640bf9469372"} Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.630088 4585 scope.go:117] "RemoveContainer" containerID="485861e66e7a293b6302b3d5ec4b307dd091303a33233d520e0dd9e36af99fb2" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.630089 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-b5dwd" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.656427 4585 scope.go:117] "RemoveContainer" containerID="d3f4fc0b4d62ae2e3ee458495a356ef7790773a219fb2653936de6d93386c676" Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.683911 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-b5dwd"] Dec 01 14:19:44 crc kubenswrapper[4585]: I1201 14:19:44.698817 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-b5dwd"] Dec 01 14:19:46 crc kubenswrapper[4585]: I1201 14:19:46.427236 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" path="/var/lib/kubelet/pods/2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d/volumes" Dec 01 14:19:55 crc kubenswrapper[4585]: I1201 14:19:55.728518 4585 generic.go:334] "Generic (PLEG): container finished" podID="b271e13c-b935-4f31-a32d-865af7228e55" containerID="a31b711a5b8afd65ad9f661a71bf84c9b1899dd96fd2798d26e8657ec6bb7581" exitCode=0 Dec 01 14:19:55 crc kubenswrapper[4585]: I1201 14:19:55.728618 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b271e13c-b935-4f31-a32d-865af7228e55","Type":"ContainerDied","Data":"a31b711a5b8afd65ad9f661a71bf84c9b1899dd96fd2798d26e8657ec6bb7581"} Dec 01 14:19:56 crc kubenswrapper[4585]: I1201 14:19:56.737702 4585 generic.go:334] "Generic (PLEG): container finished" podID="645c2200-d127-4ffe-a91e-9f9ae104dc06" containerID="1c89711d913f5eabaa4681e8495f26315d6ccd2dda9d0227e534402e665f7cdb" exitCode=0 Dec 01 14:19:56 crc kubenswrapper[4585]: I1201 14:19:56.737777 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"645c2200-d127-4ffe-a91e-9f9ae104dc06","Type":"ContainerDied","Data":"1c89711d913f5eabaa4681e8495f26315d6ccd2dda9d0227e534402e665f7cdb"} Dec 01 14:19:56 crc kubenswrapper[4585]: I1201 14:19:56.740277 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b271e13c-b935-4f31-a32d-865af7228e55","Type":"ContainerStarted","Data":"31c29d918a16b06ced3ffab0718ee4834cbc1d7db247cbb20197c5b54cb918e0"} Dec 01 14:19:56 crc kubenswrapper[4585]: I1201 14:19:56.740902 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 14:19:56 crc kubenswrapper[4585]: I1201 14:19:56.866808 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.866784599 podStartE2EDuration="36.866784599s" podCreationTimestamp="2025-12-01 14:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:19:56.81132863 +0000 UTC m=+1310.795542495" watchObservedRunningTime="2025-12-01 14:19:56.866784599 +0000 UTC m=+1310.850998454" Dec 01 14:19:57 crc kubenswrapper[4585]: I1201 14:19:57.751278 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"645c2200-d127-4ffe-a91e-9f9ae104dc06","Type":"ContainerStarted","Data":"8f305acb55b1c8226fd0591426c73c2b183abee36264d9b7892f3d82dfb15a2d"} Dec 01 14:19:57 crc kubenswrapper[4585]: I1201 14:19:57.751789 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:19:57 crc kubenswrapper[4585]: I1201 14:19:57.774915 4585 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.774897886 podStartE2EDuration="36.774897886s" podCreationTimestamp="2025-12-01 14:19:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:19:57.771618069 +0000 UTC m=+1311.755831924" watchObservedRunningTime="2025-12-01 14:19:57.774897886 +0000 UTC m=+1311.759111741" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.166763 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g"] Dec 01 14:20:02 crc kubenswrapper[4585]: E1201 14:20:02.167645 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3580eb6-20f0-4ed5-a45b-6b081edd487d" containerName="init" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.167661 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3580eb6-20f0-4ed5-a45b-6b081edd487d" containerName="init" Dec 01 14:20:02 crc kubenswrapper[4585]: E1201 14:20:02.167682 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3580eb6-20f0-4ed5-a45b-6b081edd487d" containerName="dnsmasq-dns" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.167690 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3580eb6-20f0-4ed5-a45b-6b081edd487d" containerName="dnsmasq-dns" Dec 01 14:20:02 crc kubenswrapper[4585]: E1201 14:20:02.167724 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" containerName="dnsmasq-dns" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.167732 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" containerName="dnsmasq-dns" Dec 01 14:20:02 crc kubenswrapper[4585]: E1201 14:20:02.167760 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" containerName="init" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.167768 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" containerName="init" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.168007 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="2286dd35-c7a4-4b2b-bb9d-93a07c5bce8d" containerName="dnsmasq-dns" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.168021 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3580eb6-20f0-4ed5-a45b-6b081edd487d" containerName="dnsmasq-dns" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.168791 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.177641 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.178553 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.179272 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.179525 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.196620 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g"] Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.227570 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p647g\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.227614 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p647g\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.227665 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p647g\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.227698 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxwh6\" (UniqueName: \"kubernetes.io/projected/c334f141-1564-4112-a013-53207cf5900c-kube-api-access-cxwh6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p647g\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.329848 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p647g\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.329931 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxwh6\" (UniqueName: \"kubernetes.io/projected/c334f141-1564-4112-a013-53207cf5900c-kube-api-access-cxwh6\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-p647g\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.330120 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p647g\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.330155 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p647g\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.338378 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p647g\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.338592 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p647g\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.338498 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p647g\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.353742 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxwh6\" (UniqueName: \"kubernetes.io/projected/c334f141-1564-4112-a013-53207cf5900c-kube-api-access-cxwh6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-p647g\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:02 crc kubenswrapper[4585]: I1201 14:20:02.508937 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:03 crc kubenswrapper[4585]: I1201 14:20:03.092174 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g"] Dec 01 14:20:03 crc kubenswrapper[4585]: I1201 14:20:03.806725 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" event={"ID":"c334f141-1564-4112-a013-53207cf5900c","Type":"ContainerStarted","Data":"264688b5d55b1540cfa54a62f009e3890b6311843f2a127ec2a1e87dabe042b1"} Dec 01 14:20:10 crc kubenswrapper[4585]: I1201 14:20:10.705311 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 14:20:11 crc kubenswrapper[4585]: I1201 14:20:11.753203 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 14:20:12 crc kubenswrapper[4585]: I1201 14:20:12.627517 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:20:12 crc kubenswrapper[4585]: I1201 14:20:12.938795 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" event={"ID":"c334f141-1564-4112-a013-53207cf5900c","Type":"ContainerStarted","Data":"817d9a62d3947afdd6ed65b28d2b3c73ae3691b9b585ad43d1d0833605e0741f"} Dec 01 14:20:12 crc kubenswrapper[4585]: I1201 14:20:12.961338 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" podStartSLOduration=1.43805253 podStartE2EDuration="10.961321374s" podCreationTimestamp="2025-12-01 14:20:02 +0000 UTC" firstStartedPulling="2025-12-01 14:20:03.101776962 +0000 UTC m=+1317.085990817" lastFinishedPulling="2025-12-01 14:20:12.625045806 +0000 UTC m=+1326.609259661" observedRunningTime="2025-12-01 14:20:12.954278386 +0000 UTC m=+1326.938492261" watchObservedRunningTime="2025-12-01 14:20:12.961321374 +0000 UTC m=+1326.945535229" Dec 01 14:20:24 crc kubenswrapper[4585]: I1201 14:20:24.032809 4585 generic.go:334] "Generic (PLEG): container finished" podID="c334f141-1564-4112-a013-53207cf5900c" containerID="817d9a62d3947afdd6ed65b28d2b3c73ae3691b9b585ad43d1d0833605e0741f" exitCode=0 Dec 01 14:20:24 crc kubenswrapper[4585]: I1201 14:20:24.032893 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" event={"ID":"c334f141-1564-4112-a013-53207cf5900c","Type":"ContainerDied","Data":"817d9a62d3947afdd6ed65b28d2b3c73ae3691b9b585ad43d1d0833605e0741f"} Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.476307 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.553207 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxwh6\" (UniqueName: \"kubernetes.io/projected/c334f141-1564-4112-a013-53207cf5900c-kube-api-access-cxwh6\") pod \"c334f141-1564-4112-a013-53207cf5900c\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.553451 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-inventory\") pod \"c334f141-1564-4112-a013-53207cf5900c\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.553558 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-ssh-key\") pod \"c334f141-1564-4112-a013-53207cf5900c\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.553636 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-repo-setup-combined-ca-bundle\") pod \"c334f141-1564-4112-a013-53207cf5900c\" (UID: \"c334f141-1564-4112-a013-53207cf5900c\") " Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.567207 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c334f141-1564-4112-a013-53207cf5900c-kube-api-access-cxwh6" (OuterVolumeSpecName: "kube-api-access-cxwh6") pod "c334f141-1564-4112-a013-53207cf5900c" (UID: "c334f141-1564-4112-a013-53207cf5900c"). InnerVolumeSpecName "kube-api-access-cxwh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.584489 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c334f141-1564-4112-a013-53207cf5900c" (UID: "c334f141-1564-4112-a013-53207cf5900c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.624221 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c334f141-1564-4112-a013-53207cf5900c" (UID: "c334f141-1564-4112-a013-53207cf5900c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.628095 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-inventory" (OuterVolumeSpecName: "inventory") pod "c334f141-1564-4112-a013-53207cf5900c" (UID: "c334f141-1564-4112-a013-53207cf5900c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.656716 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.656760 4585 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.656775 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxwh6\" (UniqueName: \"kubernetes.io/projected/c334f141-1564-4112-a013-53207cf5900c-kube-api-access-cxwh6\") on node \"crc\" DevicePath \"\"" Dec 01 14:20:25 crc kubenswrapper[4585]: I1201 14:20:25.656786 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c334f141-1564-4112-a013-53207cf5900c-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.055384 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" event={"ID":"c334f141-1564-4112-a013-53207cf5900c","Type":"ContainerDied","Data":"264688b5d55b1540cfa54a62f009e3890b6311843f2a127ec2a1e87dabe042b1"} Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.055434 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="264688b5d55b1540cfa54a62f009e3890b6311843f2a127ec2a1e87dabe042b1" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.055466 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-p647g" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.147041 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg"] Dec 01 14:20:26 crc kubenswrapper[4585]: E1201 14:20:26.147455 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c334f141-1564-4112-a013-53207cf5900c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.147472 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c334f141-1564-4112-a013-53207cf5900c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.147669 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c334f141-1564-4112-a013-53207cf5900c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.148275 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.151108 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.151146 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.151245 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.152806 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.182648 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg"] Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.279686 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g222z\" (UniqueName: \"kubernetes.io/projected/883ed263-3b11-459f-83d8-c29a49f9c79c-kube-api-access-g222z\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-47qxg\" (UID: \"883ed263-3b11-459f-83d8-c29a49f9c79c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.279734 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/883ed263-3b11-459f-83d8-c29a49f9c79c-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-47qxg\" (UID: \"883ed263-3b11-459f-83d8-c29a49f9c79c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.279763 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/883ed263-3b11-459f-83d8-c29a49f9c79c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-47qxg\" (UID: \"883ed263-3b11-459f-83d8-c29a49f9c79c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.381743 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g222z\" (UniqueName: \"kubernetes.io/projected/883ed263-3b11-459f-83d8-c29a49f9c79c-kube-api-access-g222z\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-47qxg\" (UID: \"883ed263-3b11-459f-83d8-c29a49f9c79c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.382033 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/883ed263-3b11-459f-83d8-c29a49f9c79c-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-47qxg\" (UID: \"883ed263-3b11-459f-83d8-c29a49f9c79c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.382101 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/883ed263-3b11-459f-83d8-c29a49f9c79c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-47qxg\" (UID: \"883ed263-3b11-459f-83d8-c29a49f9c79c\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.387709 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/883ed263-3b11-459f-83d8-c29a49f9c79c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-47qxg\" (UID: \"883ed263-3b11-459f-83d8-c29a49f9c79c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.392466 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/883ed263-3b11-459f-83d8-c29a49f9c79c-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-47qxg\" (UID: \"883ed263-3b11-459f-83d8-c29a49f9c79c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.402588 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g222z\" (UniqueName: \"kubernetes.io/projected/883ed263-3b11-459f-83d8-c29a49f9c79c-kube-api-access-g222z\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-47qxg\" (UID: \"883ed263-3b11-459f-83d8-c29a49f9c79c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:26 crc kubenswrapper[4585]: I1201 14:20:26.517519 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:27 crc kubenswrapper[4585]: I1201 14:20:27.052840 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg"] Dec 01 14:20:28 crc kubenswrapper[4585]: I1201 14:20:28.077400 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" event={"ID":"883ed263-3b11-459f-83d8-c29a49f9c79c","Type":"ContainerStarted","Data":"469fcfcb189aced99ffe62848b90a5612d5226ef285e867eb67c2c4f3fc5d8c4"} Dec 01 14:20:28 crc kubenswrapper[4585]: I1201 14:20:28.078009 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" event={"ID":"883ed263-3b11-459f-83d8-c29a49f9c79c","Type":"ContainerStarted","Data":"f0d61ea57dd5fe35461a2b4357e972eae7140a95e5b20a42c1cc88fdb49e55ab"} Dec 01 14:20:28 crc kubenswrapper[4585]: I1201 14:20:28.093897 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" podStartSLOduration=1.629918473 podStartE2EDuration="2.093881086s" podCreationTimestamp="2025-12-01 14:20:26 +0000 UTC" firstStartedPulling="2025-12-01 14:20:27.068314636 +0000 UTC m=+1341.052528491" lastFinishedPulling="2025-12-01 14:20:27.532277249 +0000 UTC m=+1341.516491104" observedRunningTime="2025-12-01 14:20:28.089989142 +0000 UTC m=+1342.074202997" watchObservedRunningTime="2025-12-01 14:20:28.093881086 +0000 UTC m=+1342.078094941" Dec 01 14:20:31 crc kubenswrapper[4585]: I1201 14:20:31.102871 4585 generic.go:334] "Generic (PLEG): container finished" podID="883ed263-3b11-459f-83d8-c29a49f9c79c" containerID="469fcfcb189aced99ffe62848b90a5612d5226ef285e867eb67c2c4f3fc5d8c4" exitCode=0 Dec 01 14:20:31 crc kubenswrapper[4585]: I1201 14:20:31.103011 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" 
event={"ID":"883ed263-3b11-459f-83d8-c29a49f9c79c","Type":"ContainerDied","Data":"469fcfcb189aced99ffe62848b90a5612d5226ef285e867eb67c2c4f3fc5d8c4"} Dec 01 14:20:32 crc kubenswrapper[4585]: I1201 14:20:32.223513 4585 scope.go:117] "RemoveContainer" containerID="6d30ffa5587b83f16bca0f1a93349dff61fb549a0681bd8110eb45017a5f9200" Dec 01 14:20:32 crc kubenswrapper[4585]: I1201 14:20:32.524229 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:32 crc kubenswrapper[4585]: I1201 14:20:32.596574 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/883ed263-3b11-459f-83d8-c29a49f9c79c-ssh-key\") pod \"883ed263-3b11-459f-83d8-c29a49f9c79c\" (UID: \"883ed263-3b11-459f-83d8-c29a49f9c79c\") " Dec 01 14:20:32 crc kubenswrapper[4585]: I1201 14:20:32.596638 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/883ed263-3b11-459f-83d8-c29a49f9c79c-inventory\") pod \"883ed263-3b11-459f-83d8-c29a49f9c79c\" (UID: \"883ed263-3b11-459f-83d8-c29a49f9c79c\") " Dec 01 14:20:32 crc kubenswrapper[4585]: I1201 14:20:32.596754 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g222z\" (UniqueName: \"kubernetes.io/projected/883ed263-3b11-459f-83d8-c29a49f9c79c-kube-api-access-g222z\") pod \"883ed263-3b11-459f-83d8-c29a49f9c79c\" (UID: \"883ed263-3b11-459f-83d8-c29a49f9c79c\") " Dec 01 14:20:32 crc kubenswrapper[4585]: I1201 14:20:32.611717 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/883ed263-3b11-459f-83d8-c29a49f9c79c-kube-api-access-g222z" (OuterVolumeSpecName: "kube-api-access-g222z") pod "883ed263-3b11-459f-83d8-c29a49f9c79c" (UID: "883ed263-3b11-459f-83d8-c29a49f9c79c"). InnerVolumeSpecName "kube-api-access-g222z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:20:32 crc kubenswrapper[4585]: I1201 14:20:32.623719 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883ed263-3b11-459f-83d8-c29a49f9c79c-inventory" (OuterVolumeSpecName: "inventory") pod "883ed263-3b11-459f-83d8-c29a49f9c79c" (UID: "883ed263-3b11-459f-83d8-c29a49f9c79c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:20:32 crc kubenswrapper[4585]: I1201 14:20:32.625023 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883ed263-3b11-459f-83d8-c29a49f9c79c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "883ed263-3b11-459f-83d8-c29a49f9c79c" (UID: "883ed263-3b11-459f-83d8-c29a49f9c79c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:20:32 crc kubenswrapper[4585]: I1201 14:20:32.699112 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/883ed263-3b11-459f-83d8-c29a49f9c79c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:20:32 crc kubenswrapper[4585]: I1201 14:20:32.699146 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/883ed263-3b11-459f-83d8-c29a49f9c79c-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:20:32 crc kubenswrapper[4585]: I1201 14:20:32.699156 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g222z\" (UniqueName: \"kubernetes.io/projected/883ed263-3b11-459f-83d8-c29a49f9c79c-kube-api-access-g222z\") on node \"crc\" DevicePath \"\"" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.125816 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" event={"ID":"883ed263-3b11-459f-83d8-c29a49f9c79c","Type":"ContainerDied","Data":"f0d61ea57dd5fe35461a2b4357e972eae7140a95e5b20a42c1cc88fdb49e55ab"} Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.126334 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0d61ea57dd5fe35461a2b4357e972eae7140a95e5b20a42c1cc88fdb49e55ab" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.125879 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-47qxg" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.216261 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr"] Dec 01 14:20:33 crc kubenswrapper[4585]: E1201 14:20:33.216694 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="883ed263-3b11-459f-83d8-c29a49f9c79c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.216715 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="883ed263-3b11-459f-83d8-c29a49f9c79c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.216962 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="883ed263-3b11-459f-83d8-c29a49f9c79c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.220789 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.224441 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.224443 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.226346 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.226786 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.236241 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr"] Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.313726 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.313771 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.314015 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92k2c\" (UniqueName: \"kubernetes.io/projected/9d1f4c36-f08f-4359-a950-a506d064998b-kube-api-access-92k2c\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.314310 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.416190 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92k2c\" (UniqueName: \"kubernetes.io/projected/9d1f4c36-f08f-4359-a950-a506d064998b-kube-api-access-92k2c\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.416353 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.416442 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.416473 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.420882 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.423006 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.424479 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.434880 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92k2c\" (UniqueName: \"kubernetes.io/projected/9d1f4c36-f08f-4359-a950-a506d064998b-kube-api-access-92k2c\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:33 crc kubenswrapper[4585]: I1201 14:20:33.568414 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:20:34 crc kubenswrapper[4585]: I1201 14:20:34.104443 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr"] Dec 01 14:20:34 crc kubenswrapper[4585]: I1201 14:20:34.135884 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" event={"ID":"9d1f4c36-f08f-4359-a950-a506d064998b","Type":"ContainerStarted","Data":"aba16e997bede90653866f6ffc85636c3059ef0f5d0ba0f631a6bc5cfa496087"} Dec 01 14:20:35 crc kubenswrapper[4585]: I1201 14:20:35.147247 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" event={"ID":"9d1f4c36-f08f-4359-a950-a506d064998b","Type":"ContainerStarted","Data":"15ce22ceb1595607b17310e8bf8c8b00081e4c0a7ac84de50185b8ad8265ebd3"} Dec 01 14:21:13 crc kubenswrapper[4585]: I1201 14:21:13.716518 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:21:13 crc kubenswrapper[4585]: I1201 14:21:13.717265 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:21:32 crc kubenswrapper[4585]: I1201 14:21:32.382117 4585 scope.go:117] "RemoveContainer" containerID="994dfe87ff965b12b9062dd5f637b19bf84c6b24024cdd25b602dee97943a437" Dec 01 14:21:32 crc kubenswrapper[4585]: I1201 14:21:32.423374 4585 scope.go:117] "RemoveContainer" containerID="b777eaee6c922789f9a336f3ca54a96b3c11e519ca4993ef5e19721ca1c35cf5" Dec 01 14:21:43 crc kubenswrapper[4585]: I1201 14:21:43.716401 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:21:43 crc kubenswrapper[4585]: I1201 14:21:43.718208 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:22:13 crc kubenswrapper[4585]: I1201 14:22:13.716577 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:22:13 crc kubenswrapper[4585]: I1201 14:22:13.717098 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Dec 01 14:22:13 crc kubenswrapper[4585]: I1201 14:22:13.717142 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:22:13 crc kubenswrapper[4585]: I1201 14:22:13.717956 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1fbf41077863f3fc04eeb32135f7e1a50fc3bb2ac74df27d60186d6226d4dc1b"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:22:13 crc kubenswrapper[4585]: I1201 14:22:13.718045 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://1fbf41077863f3fc04eeb32135f7e1a50fc3bb2ac74df27d60186d6226d4dc1b" gracePeriod=600 Dec 01 14:22:14 crc kubenswrapper[4585]: I1201 14:22:14.209166 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="1fbf41077863f3fc04eeb32135f7e1a50fc3bb2ac74df27d60186d6226d4dc1b" exitCode=0 Dec 01 14:22:14 crc kubenswrapper[4585]: I1201 14:22:14.209514 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"1fbf41077863f3fc04eeb32135f7e1a50fc3bb2ac74df27d60186d6226d4dc1b"} Dec 01 14:22:14 crc kubenswrapper[4585]: I1201 14:22:14.209589 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593"} Dec 01 14:22:14 crc kubenswrapper[4585]: I1201 14:22:14.209605 4585 scope.go:117] "RemoveContainer" containerID="ad6574d507c0610da07eb42cf40383d7aa7800bda84bee35a347684dc954f810" Dec 01 14:22:14 crc kubenswrapper[4585]: I1201 14:22:14.238155 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" podStartSLOduration=100.668993959 podStartE2EDuration="1m41.238126535s" podCreationTimestamp="2025-12-01 14:20:33 +0000 UTC" firstStartedPulling="2025-12-01 14:20:34.107090924 +0000 UTC m=+1348.091304779" lastFinishedPulling="2025-12-01 14:20:34.6762235 +0000 UTC m=+1348.660437355" observedRunningTime="2025-12-01 14:20:35.173670165 +0000 UTC m=+1349.157884020" watchObservedRunningTime="2025-12-01 14:22:14.238126535 +0000 UTC m=+1448.222340400" Dec 01 14:22:32 crc kubenswrapper[4585]: I1201 14:22:32.505166 4585 scope.go:117] "RemoveContainer" containerID="1f18573e739a744b3eabfcf4b261bed85eaea03874ad4e993ab153ba3999ffcf" Dec 01 14:22:32 crc kubenswrapper[4585]: I1201 14:22:32.695965 4585 scope.go:117] "RemoveContainer" containerID="3a9ba0070c1ab7d5e1a7069b52779c3f7403565c411ecec70078fa83ead02784" Dec 01 14:22:32 crc kubenswrapper[4585]: I1201 14:22:32.721486 4585 scope.go:117] "RemoveContainer" containerID="44ac16e77a217a165b7ab75f4de65db78cc2a9b0babeb3e2eb14c57da0487bba" Dec 01 14:22:32 crc kubenswrapper[4585]: I1201 14:22:32.744339 4585 scope.go:117] "RemoveContainer" containerID="35efd55eac44182903f892f02ab338367b78ab9ae756aaa0f340e6e43426aacc" Dec 01 14:22:32 crc kubenswrapper[4585]: 
I1201 14:22:32.764928 4585 scope.go:117] "RemoveContainer" containerID="b0eb593f0744ac672633eec711ebaac9fa0a287fef4a949cc99dcd5d1b19b7d3" Dec 01 14:23:43 crc kubenswrapper[4585]: I1201 14:23:43.795501 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jlvgm"] Dec 01 14:23:43 crc kubenswrapper[4585]: I1201 14:23:43.797943 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:43 crc kubenswrapper[4585]: I1201 14:23:43.809215 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlvgm"] Dec 01 14:23:43 crc kubenswrapper[4585]: I1201 14:23:43.954622 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6lqd\" (UniqueName: \"kubernetes.io/projected/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-kube-api-access-x6lqd\") pod \"redhat-operators-jlvgm\" (UID: \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\") " pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:43 crc kubenswrapper[4585]: I1201 14:23:43.954701 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-utilities\") pod \"redhat-operators-jlvgm\" (UID: \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\") " pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:43 crc kubenswrapper[4585]: I1201 14:23:43.955142 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-catalog-content\") pod \"redhat-operators-jlvgm\" (UID: \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\") " pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:44 crc kubenswrapper[4585]: I1201 14:23:44.056954 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-catalog-content\") pod \"redhat-operators-jlvgm\" (UID: \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\") " pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:44 crc kubenswrapper[4585]: I1201 14:23:44.057098 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6lqd\" (UniqueName: \"kubernetes.io/projected/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-kube-api-access-x6lqd\") pod \"redhat-operators-jlvgm\" (UID: \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\") " pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:44 crc kubenswrapper[4585]: I1201 14:23:44.057145 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-utilities\") pod \"redhat-operators-jlvgm\" (UID: \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\") " pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:44 crc kubenswrapper[4585]: I1201 14:23:44.057511 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-catalog-content\") pod \"redhat-operators-jlvgm\" (UID: \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\") " pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:44 crc kubenswrapper[4585]: I1201 14:23:44.057565 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-utilities\") pod \"redhat-operators-jlvgm\" (UID: \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\") " pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:44 crc kubenswrapper[4585]: I1201 14:23:44.075671 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6lqd\" (UniqueName: \"kubernetes.io/projected/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-kube-api-access-x6lqd\") pod \"redhat-operators-jlvgm\" (UID: \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\") " pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:44 crc kubenswrapper[4585]: I1201 14:23:44.176704 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:44 crc kubenswrapper[4585]: I1201 14:23:44.616635 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlvgm"] Dec 01 14:23:45 crc kubenswrapper[4585]: I1201 14:23:45.326165 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvgm" event={"ID":"e84f8c73-2e06-4dfd-a147-bccf0036bb8b","Type":"ContainerStarted","Data":"ac5a7569cc848d9447942d079c8536106c433e298647bb5331db4c3ccf869c50"} Dec 01 14:23:45 crc kubenswrapper[4585]: I1201 14:23:45.326678 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvgm" event={"ID":"e84f8c73-2e06-4dfd-a147-bccf0036bb8b","Type":"ContainerStarted","Data":"cb6bdd83be8f3f8fb064d53cc1ec2018f2cf8d1b4d9fd7d1a488b97b7b383bd3"} Dec 01 14:23:45 crc kubenswrapper[4585]: I1201 14:23:45.330022 4585 generic.go:334] "Generic (PLEG): container finished" podID="9d1f4c36-f08f-4359-a950-a506d064998b" containerID="15ce22ceb1595607b17310e8bf8c8b00081e4c0a7ac84de50185b8ad8265ebd3" exitCode=0 Dec 01 14:23:45 crc kubenswrapper[4585]: I1201 14:23:45.330100 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" event={"ID":"9d1f4c36-f08f-4359-a950-a506d064998b","Type":"ContainerDied","Data":"15ce22ceb1595607b17310e8bf8c8b00081e4c0a7ac84de50185b8ad8265ebd3"} Dec 01 14:23:46 crc kubenswrapper[4585]: I1201 14:23:46.340214 4585 generic.go:334] "Generic (PLEG): container finished" podID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" containerID="ac5a7569cc848d9447942d079c8536106c433e298647bb5331db4c3ccf869c50" exitCode=0 Dec 01 14:23:46 crc kubenswrapper[4585]: I1201 14:23:46.340254 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvgm" event={"ID":"e84f8c73-2e06-4dfd-a147-bccf0036bb8b","Type":"ContainerDied","Data":"ac5a7569cc848d9447942d079c8536106c433e298647bb5331db4c3ccf869c50"} Dec 01 14:23:46 crc kubenswrapper[4585]: I1201 14:23:46.342590 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 14:23:46 crc kubenswrapper[4585]: I1201 14:23:46.757450 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:23:46 crc kubenswrapper[4585]: I1201 14:23:46.914106 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-ssh-key\") pod \"9d1f4c36-f08f-4359-a950-a506d064998b\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " Dec 01 14:23:46 crc kubenswrapper[4585]: I1201 14:23:46.914190 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92k2c\" (UniqueName: \"kubernetes.io/projected/9d1f4c36-f08f-4359-a950-a506d064998b-kube-api-access-92k2c\") pod \"9d1f4c36-f08f-4359-a950-a506d064998b\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " Dec 01 14:23:46 crc kubenswrapper[4585]: I1201 14:23:46.914379 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-inventory\") pod \"9d1f4c36-f08f-4359-a950-a506d064998b\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " Dec 01 14:23:46 crc kubenswrapper[4585]: I1201 14:23:46.915111 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-bootstrap-combined-ca-bundle\") pod \"9d1f4c36-f08f-4359-a950-a506d064998b\" (UID: \"9d1f4c36-f08f-4359-a950-a506d064998b\") " Dec 01 14:23:46 crc kubenswrapper[4585]: I1201 14:23:46.920302 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1f4c36-f08f-4359-a950-a506d064998b-kube-api-access-92k2c" (OuterVolumeSpecName: "kube-api-access-92k2c") pod "9d1f4c36-f08f-4359-a950-a506d064998b" (UID: "9d1f4c36-f08f-4359-a950-a506d064998b"). InnerVolumeSpecName "kube-api-access-92k2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:23:46 crc kubenswrapper[4585]: I1201 14:23:46.925711 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9d1f4c36-f08f-4359-a950-a506d064998b" (UID: "9d1f4c36-f08f-4359-a950-a506d064998b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:23:46 crc kubenswrapper[4585]: I1201 14:23:46.973626 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9d1f4c36-f08f-4359-a950-a506d064998b" (UID: "9d1f4c36-f08f-4359-a950-a506d064998b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:23:46 crc kubenswrapper[4585]: I1201 14:23:46.995347 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-inventory" (OuterVolumeSpecName: "inventory") pod "9d1f4c36-f08f-4359-a950-a506d064998b" (UID: "9d1f4c36-f08f-4359-a950-a506d064998b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.017151 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.017192 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92k2c\" (UniqueName: \"kubernetes.io/projected/9d1f4c36-f08f-4359-a950-a506d064998b-kube-api-access-92k2c\") on node \"crc\" DevicePath \"\"" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.017203 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.017212 4585 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1f4c36-f08f-4359-a950-a506d064998b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.349990 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" event={"ID":"9d1f4c36-f08f-4359-a950-a506d064998b","Type":"ContainerDied","Data":"aba16e997bede90653866f6ffc85636c3059ef0f5d0ba0f631a6bc5cfa496087"} Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.350035 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aba16e997bede90653866f6ffc85636c3059ef0f5d0ba0f631a6bc5cfa496087" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.350117 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.505496 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp"] Dec 01 14:23:47 crc kubenswrapper[4585]: E1201 14:23:47.505886 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1f4c36-f08f-4359-a950-a506d064998b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.505902 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1f4c36-f08f-4359-a950-a506d064998b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.506132 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d1f4c36-f08f-4359-a950-a506d064998b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.506739 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.512553 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.513829 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.513877 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.513829 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.521231 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp"] Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.628119 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bed7040b-db55-41ae-9384-7b730ced5331-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp\" (UID: \"bed7040b-db55-41ae-9384-7b730ced5331\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.628188 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bed7040b-db55-41ae-9384-7b730ced5331-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp\" (UID: \"bed7040b-db55-41ae-9384-7b730ced5331\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.628222 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfnkb\" (UniqueName: \"kubernetes.io/projected/bed7040b-db55-41ae-9384-7b730ced5331-kube-api-access-rfnkb\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp\" (UID: \"bed7040b-db55-41ae-9384-7b730ced5331\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.729387 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bed7040b-db55-41ae-9384-7b730ced5331-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp\" (UID: \"bed7040b-db55-41ae-9384-7b730ced5331\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.729440 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bed7040b-db55-41ae-9384-7b730ced5331-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp\" (UID: \"bed7040b-db55-41ae-9384-7b730ced5331\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.729463 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfnkb\" (UniqueName: \"kubernetes.io/projected/bed7040b-db55-41ae-9384-7b730ced5331-kube-api-access-rfnkb\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp\" (UID: \"bed7040b-db55-41ae-9384-7b730ced5331\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.735826 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bed7040b-db55-41ae-9384-7b730ced5331-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp\" (UID: \"bed7040b-db55-41ae-9384-7b730ced5331\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.737184 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bed7040b-db55-41ae-9384-7b730ced5331-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp\" (UID: \"bed7040b-db55-41ae-9384-7b730ced5331\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.760956 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfnkb\" (UniqueName: \"kubernetes.io/projected/bed7040b-db55-41ae-9384-7b730ced5331-kube-api-access-rfnkb\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp\" (UID: \"bed7040b-db55-41ae-9384-7b730ced5331\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:23:47 crc kubenswrapper[4585]: I1201 14:23:47.821900 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:23:48 crc kubenswrapper[4585]: I1201 14:23:48.370750 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvgm" event={"ID":"e84f8c73-2e06-4dfd-a147-bccf0036bb8b","Type":"ContainerStarted","Data":"14c56e3259e08a56d4b71fd88067137f387499b1bac3e1575d540a78ef3da388"} Dec 01 14:23:48 crc kubenswrapper[4585]: I1201 14:23:48.539388 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp"] Dec 01 14:23:48 crc kubenswrapper[4585]: W1201 14:23:48.541303 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbed7040b_db55_41ae_9384_7b730ced5331.slice/crio-e87f87769bbc0ef948759755bc25b07de2b59cef66194bf229b1a8ba753c90a3 WatchSource:0}: Error finding container e87f87769bbc0ef948759755bc25b07de2b59cef66194bf229b1a8ba753c90a3: Status 404 returned error can't find the container with id e87f87769bbc0ef948759755bc25b07de2b59cef66194bf229b1a8ba753c90a3 Dec 01 14:23:49 crc kubenswrapper[4585]: I1201 14:23:49.381755 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" event={"ID":"bed7040b-db55-41ae-9384-7b730ced5331","Type":"ContainerStarted","Data":"e87f87769bbc0ef948759755bc25b07de2b59cef66194bf229b1a8ba753c90a3"} Dec 01 14:23:51 crc kubenswrapper[4585]: I1201 14:23:51.416132 4585 generic.go:334] "Generic (PLEG): container finished" podID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" containerID="14c56e3259e08a56d4b71fd88067137f387499b1bac3e1575d540a78ef3da388" exitCode=0 Dec 01 14:23:51 crc kubenswrapper[4585]: I1201 14:23:51.416231 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvgm" 
event={"ID":"e84f8c73-2e06-4dfd-a147-bccf0036bb8b","Type":"ContainerDied","Data":"14c56e3259e08a56d4b71fd88067137f387499b1bac3e1575d540a78ef3da388"} Dec 01 14:23:51 crc kubenswrapper[4585]: I1201 14:23:51.424466 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" event={"ID":"bed7040b-db55-41ae-9384-7b730ced5331","Type":"ContainerStarted","Data":"bf31b4fedb87a866365c826937f1c2fea02f17fec2b3ede10021f3aaca9ba72a"} Dec 01 14:23:51 crc kubenswrapper[4585]: I1201 14:23:51.461568 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" podStartSLOduration=2.656565565 podStartE2EDuration="4.46154613s" podCreationTimestamp="2025-12-01 14:23:47 +0000 UTC" firstStartedPulling="2025-12-01 14:23:48.543657722 +0000 UTC m=+1542.527871577" lastFinishedPulling="2025-12-01 14:23:50.348638287 +0000 UTC m=+1544.332852142" observedRunningTime="2025-12-01 14:23:51.449189585 +0000 UTC m=+1545.433403440" watchObservedRunningTime="2025-12-01 14:23:51.46154613 +0000 UTC m=+1545.445759985" Dec 01 14:23:53 crc kubenswrapper[4585]: I1201 14:23:53.461215 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvgm" event={"ID":"e84f8c73-2e06-4dfd-a147-bccf0036bb8b","Type":"ContainerStarted","Data":"d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf"} Dec 01 14:23:53 crc kubenswrapper[4585]: I1201 14:23:53.493518 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jlvgm" podStartSLOduration=4.464927886 podStartE2EDuration="10.493496124s" podCreationTimestamp="2025-12-01 14:23:43 +0000 UTC" firstStartedPulling="2025-12-01 14:23:46.342380881 +0000 UTC m=+1540.326594736" lastFinishedPulling="2025-12-01 14:23:52.370949119 +0000 UTC m=+1546.355162974" observedRunningTime="2025-12-01 14:23:53.484804105 +0000 UTC m=+1547.469017960" watchObservedRunningTime="2025-12-01 14:23:53.493496124 +0000 UTC m=+1547.477709989" Dec 01 14:23:54 crc kubenswrapper[4585]: I1201 14:23:54.177246 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:54 crc kubenswrapper[4585]: I1201 14:23:54.177300 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:23:55 crc kubenswrapper[4585]: I1201 14:23:55.221740 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jlvgm" podUID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" containerName="registry-server" probeResult="failure" output=< Dec 01 14:23:55 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:23:55 crc kubenswrapper[4585]: > Dec 01 14:24:04 crc kubenswrapper[4585]: I1201 14:24:04.230097 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:24:04 crc kubenswrapper[4585]: I1201 14:24:04.284087 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:24:04 crc kubenswrapper[4585]: I1201 14:24:04.466131 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlvgm"] Dec 01 14:24:05 crc kubenswrapper[4585]: I1201 14:24:05.050932 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-3772-account-create-update-x874m"] Dec 01 14:24:05 crc kubenswrapper[4585]: I1201 14:24:05.062166 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3772-account-create-update-x874m"] Dec 01 14:24:05 crc kubenswrapper[4585]: I1201 14:24:05.564487 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jlvgm" podUID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" containerName="registry-server" containerID="cri-o://d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf" gracePeriod=2 Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.045536 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.057221 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-x849h"] Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.075260 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jfsfm"] Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.083786 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-4d2c-account-create-update-d4hsr"] Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.104606 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-x849h"] Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.104631 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jfsfm"] Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.114024 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-4d2c-account-create-update-d4hsr"] Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.220480 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6lqd\" (UniqueName: \"kubernetes.io/projected/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-kube-api-access-x6lqd\") pod \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\" (UID: \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\") " Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.220715 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-catalog-content\") pod \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\" (UID: \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\") " Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.220786 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-utilities\") pod \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\" (UID: \"e84f8c73-2e06-4dfd-a147-bccf0036bb8b\") " Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.222079 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-utilities" (OuterVolumeSpecName: "utilities") pod "e84f8c73-2e06-4dfd-a147-bccf0036bb8b" (UID: "e84f8c73-2e06-4dfd-a147-bccf0036bb8b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.234340 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-kube-api-access-x6lqd" (OuterVolumeSpecName: "kube-api-access-x6lqd") pod "e84f8c73-2e06-4dfd-a147-bccf0036bb8b" (UID: "e84f8c73-2e06-4dfd-a147-bccf0036bb8b"). InnerVolumeSpecName "kube-api-access-x6lqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.323434 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.323469 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6lqd\" (UniqueName: \"kubernetes.io/projected/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-kube-api-access-x6lqd\") on node \"crc\" DevicePath \"\"" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.341742 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e84f8c73-2e06-4dfd-a147-bccf0036bb8b" (UID: "e84f8c73-2e06-4dfd-a147-bccf0036bb8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.423006 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14c28b2f-8076-4123-8a7b-e907e8d88a30" path="/var/lib/kubelet/pods/14c28b2f-8076-4123-8a7b-e907e8d88a30/volumes" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.424373 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bef25f3-d94c-4f4a-aa88-e48fb532fcec" path="/var/lib/kubelet/pods/7bef25f3-d94c-4f4a-aa88-e48fb532fcec/volumes" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.424701 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e84f8c73-2e06-4dfd-a147-bccf0036bb8b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.425401 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="999a47fa-96dd-4791-88bc-ff5e45fe9d6b" path="/var/lib/kubelet/pods/999a47fa-96dd-4791-88bc-ff5e45fe9d6b/volumes" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.426748 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfdc1d20-5b7c-4dff-988a-a8528d764fdf" path="/var/lib/kubelet/pods/dfdc1d20-5b7c-4dff-988a-a8528d764fdf/volumes" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.577158 4585 generic.go:334] "Generic (PLEG): container finished" podID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" containerID="d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf" exitCode=0 Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.577206 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvgm" event={"ID":"e84f8c73-2e06-4dfd-a147-bccf0036bb8b","Type":"ContainerDied","Data":"d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf"} Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.577237 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlvgm" 
event={"ID":"e84f8c73-2e06-4dfd-a147-bccf0036bb8b","Type":"ContainerDied","Data":"cb6bdd83be8f3f8fb064d53cc1ec2018f2cf8d1b4d9fd7d1a488b97b7b383bd3"} Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.577260 4585 scope.go:117] "RemoveContainer" containerID="d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.577395 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlvgm" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.601756 4585 scope.go:117] "RemoveContainer" containerID="14c56e3259e08a56d4b71fd88067137f387499b1bac3e1575d540a78ef3da388" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.607927 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlvgm"] Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.618671 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jlvgm"] Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.633597 4585 scope.go:117] "RemoveContainer" containerID="ac5a7569cc848d9447942d079c8536106c433e298647bb5331db4c3ccf869c50" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.672069 4585 scope.go:117] "RemoveContainer" containerID="d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf" Dec 01 14:24:06 crc kubenswrapper[4585]: E1201 14:24:06.672580 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf\": container with ID starting with d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf not found: ID does not exist" containerID="d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.672617 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf"} err="failed to get container status \"d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf\": rpc error: code = NotFound desc = could not find container \"d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf\": container with ID starting with d0c336c1032975de09b6948959a1bc147336e6c79ba5c74964175b0ded037ebf not found: ID does not exist" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.672645 4585 scope.go:117] "RemoveContainer" containerID="14c56e3259e08a56d4b71fd88067137f387499b1bac3e1575d540a78ef3da388" Dec 01 14:24:06 crc kubenswrapper[4585]: E1201 14:24:06.673054 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c56e3259e08a56d4b71fd88067137f387499b1bac3e1575d540a78ef3da388\": container with ID starting with 14c56e3259e08a56d4b71fd88067137f387499b1bac3e1575d540a78ef3da388 not found: ID does not exist" containerID="14c56e3259e08a56d4b71fd88067137f387499b1bac3e1575d540a78ef3da388" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.673106 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c56e3259e08a56d4b71fd88067137f387499b1bac3e1575d540a78ef3da388"} err="failed to get container status \"14c56e3259e08a56d4b71fd88067137f387499b1bac3e1575d540a78ef3da388\": rpc error: code = NotFound desc = could not find container 
\"14c56e3259e08a56d4b71fd88067137f387499b1bac3e1575d540a78ef3da388\": container with ID starting with 14c56e3259e08a56d4b71fd88067137f387499b1bac3e1575d540a78ef3da388 not found: ID does not exist" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.673137 4585 scope.go:117] "RemoveContainer" containerID="ac5a7569cc848d9447942d079c8536106c433e298647bb5331db4c3ccf869c50" Dec 01 14:24:06 crc kubenswrapper[4585]: E1201 14:24:06.673543 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5a7569cc848d9447942d079c8536106c433e298647bb5331db4c3ccf869c50\": container with ID starting with ac5a7569cc848d9447942d079c8536106c433e298647bb5331db4c3ccf869c50 not found: ID does not exist" containerID="ac5a7569cc848d9447942d079c8536106c433e298647bb5331db4c3ccf869c50" Dec 01 14:24:06 crc kubenswrapper[4585]: I1201 14:24:06.673569 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5a7569cc848d9447942d079c8536106c433e298647bb5331db4c3ccf869c50"} err="failed to get container status \"ac5a7569cc848d9447942d079c8536106c433e298647bb5331db4c3ccf869c50\": rpc error: code = NotFound desc = could not find container \"ac5a7569cc848d9447942d079c8536106c433e298647bb5331db4c3ccf869c50\": container with ID starting with ac5a7569cc848d9447942d079c8536106c433e298647bb5331db4c3ccf869c50 not found: ID does not exist" Dec 01 14:24:07 crc kubenswrapper[4585]: I1201 14:24:07.045817 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f4d6-account-create-update-d4w4d"] Dec 01 14:24:07 crc kubenswrapper[4585]: I1201 14:24:07.059723 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2bc92"] Dec 01 14:24:07 crc kubenswrapper[4585]: I1201 14:24:07.070140 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f4d6-account-create-update-d4w4d"] Dec 01 14:24:07 crc kubenswrapper[4585]: I1201 14:24:07.082412 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2bc92"] Dec 01 14:24:08 crc kubenswrapper[4585]: I1201 14:24:08.424423 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4552555a-cd04-402f-84f0-48569cbf5fd8" path="/var/lib/kubelet/pods/4552555a-cd04-402f-84f0-48569cbf5fd8/volumes" Dec 01 14:24:08 crc kubenswrapper[4585]: I1201 14:24:08.426065 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbfb665c-c279-40aa-bf1d-ed326b23d184" path="/var/lib/kubelet/pods/dbfb665c-c279-40aa-bf1d-ed326b23d184/volumes" Dec 01 14:24:08 crc kubenswrapper[4585]: I1201 14:24:08.426804 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" path="/var/lib/kubelet/pods/e84f8c73-2e06-4dfd-a147-bccf0036bb8b/volumes" Dec 01 14:24:32 crc kubenswrapper[4585]: I1201 14:24:32.871260 4585 scope.go:117] "RemoveContainer" containerID="ffe4c8385d3b9dff71988384ecf41a79109d0c4ba3ae2098894a2eb53f9df2a7" Dec 01 14:24:32 crc kubenswrapper[4585]: I1201 14:24:32.902662 4585 scope.go:117] "RemoveContainer" containerID="5f22c4c22b95834a5e9420127714ab200b326e6508a28f2e24da5548eeaf290a" Dec 01 14:24:32 crc kubenswrapper[4585]: I1201 14:24:32.955746 4585 scope.go:117] "RemoveContainer" containerID="2cb5e933ddfa0be02436199b66a85f5dade5e20a4bed7bd3bf3f35e4856d32b3" Dec 01 14:24:33 crc kubenswrapper[4585]: I1201 14:24:33.010551 4585 scope.go:117] "RemoveContainer" 
containerID="efb2ed550a45d22043b3a1c37c314e2208939f217e6f04f3b25c2a6022186a6d" Dec 01 14:24:33 crc kubenswrapper[4585]: I1201 14:24:33.055680 4585 scope.go:117] "RemoveContainer" containerID="70a8a58e42054315381a278709450610373b4af816ba77e84d5982bd1209b22b" Dec 01 14:24:33 crc kubenswrapper[4585]: I1201 14:24:33.098573 4585 scope.go:117] "RemoveContainer" containerID="bd165c0b7d53bf258740e737c5ee4ef155f34bd7ac5a30965c302947dec9b090" Dec 01 14:24:33 crc kubenswrapper[4585]: I1201 14:24:33.143696 4585 scope.go:117] "RemoveContainer" containerID="67cfb6fa85a26b00c8e0ea9c2dedcad370a340f13fd73bae008662cda3e46254" Dec 01 14:24:33 crc kubenswrapper[4585]: I1201 14:24:33.204919 4585 scope.go:117] "RemoveContainer" containerID="ae80d1a98060f280309194e382f1942ff4486a774beaed833c8d1ac660088af4" Dec 01 14:24:35 crc kubenswrapper[4585]: I1201 14:24:35.042137 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-rvrx7"] Dec 01 14:24:35 crc kubenswrapper[4585]: I1201 14:24:35.053999 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-rvrx7"] Dec 01 14:24:36 crc kubenswrapper[4585]: I1201 14:24:36.425898 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3732bb19-81e6-42db-88c4-f0476ae5ace4" path="/var/lib/kubelet/pods/3732bb19-81e6-42db-88c4-f0476ae5ace4/volumes" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.446511 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-74dzn"] Dec 01 14:24:39 crc kubenswrapper[4585]: E1201 14:24:39.447211 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" containerName="extract-content" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.447223 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" containerName="extract-content" Dec 01 14:24:39 crc kubenswrapper[4585]: E1201 14:24:39.447249 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" containerName="extract-utilities" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.447255 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" containerName="extract-utilities" Dec 01 14:24:39 crc kubenswrapper[4585]: E1201 14:24:39.447281 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" containerName="registry-server" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.447289 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" containerName="registry-server" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.447481 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e84f8c73-2e06-4dfd-a147-bccf0036bb8b" containerName="registry-server" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.448835 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.467138 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-74dzn"] Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.494932 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdaf429b-b6c0-4f60-a032-17262f7466f4-utilities\") pod \"community-operators-74dzn\" (UID: \"cdaf429b-b6c0-4f60-a032-17262f7466f4\") " pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.495049 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdaf429b-b6c0-4f60-a032-17262f7466f4-catalog-content\") pod \"community-operators-74dzn\" (UID: \"cdaf429b-b6c0-4f60-a032-17262f7466f4\") " pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.495068 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5wgf\" (UniqueName: \"kubernetes.io/projected/cdaf429b-b6c0-4f60-a032-17262f7466f4-kube-api-access-c5wgf\") pod \"community-operators-74dzn\" (UID: \"cdaf429b-b6c0-4f60-a032-17262f7466f4\") " pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.597166 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdaf429b-b6c0-4f60-a032-17262f7466f4-utilities\") pod \"community-operators-74dzn\" (UID: \"cdaf429b-b6c0-4f60-a032-17262f7466f4\") " pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.597243 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdaf429b-b6c0-4f60-a032-17262f7466f4-catalog-content\") pod \"community-operators-74dzn\" (UID: \"cdaf429b-b6c0-4f60-a032-17262f7466f4\") " pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.597269 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5wgf\" (UniqueName: \"kubernetes.io/projected/cdaf429b-b6c0-4f60-a032-17262f7466f4-kube-api-access-c5wgf\") pod \"community-operators-74dzn\" (UID: \"cdaf429b-b6c0-4f60-a032-17262f7466f4\") " pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.598054 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdaf429b-b6c0-4f60-a032-17262f7466f4-utilities\") pod \"community-operators-74dzn\" (UID: \"cdaf429b-b6c0-4f60-a032-17262f7466f4\") " pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.598332 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdaf429b-b6c0-4f60-a032-17262f7466f4-catalog-content\") pod \"community-operators-74dzn\" (UID: \"cdaf429b-b6c0-4f60-a032-17262f7466f4\") " pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.618120 4585 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c5wgf\" (UniqueName: \"kubernetes.io/projected/cdaf429b-b6c0-4f60-a032-17262f7466f4-kube-api-access-c5wgf\") pod \"community-operators-74dzn\" (UID: \"cdaf429b-b6c0-4f60-a032-17262f7466f4\") " pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:39 crc kubenswrapper[4585]: I1201 14:24:39.766719 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:40 crc kubenswrapper[4585]: I1201 14:24:40.310272 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-74dzn"] Dec 01 14:24:40 crc kubenswrapper[4585]: I1201 14:24:40.895664 4585 generic.go:334] "Generic (PLEG): container finished" podID="cdaf429b-b6c0-4f60-a032-17262f7466f4" containerID="087e53ecea180fe5a0b88f748d9241a34d16403652d4842209100a9924eff214" exitCode=0 Dec 01 14:24:40 crc kubenswrapper[4585]: I1201 14:24:40.895913 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74dzn" event={"ID":"cdaf429b-b6c0-4f60-a032-17262f7466f4","Type":"ContainerDied","Data":"087e53ecea180fe5a0b88f748d9241a34d16403652d4842209100a9924eff214"} Dec 01 14:24:40 crc kubenswrapper[4585]: I1201 14:24:40.895942 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74dzn" event={"ID":"cdaf429b-b6c0-4f60-a032-17262f7466f4","Type":"ContainerStarted","Data":"7fc8379c160b2199ffa048109c40de2c401402804de5e947a524d8d6798e500c"} Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.045031 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-5bz9b"] Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.063601 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a91f-account-create-update-v4qw7"] Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.076103 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-k9pq4"] Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.086962 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-lrc4n"] Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.097595 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-k9pq4"] Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.105724 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-5bz9b"] Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.116478 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a91f-account-create-update-v4qw7"] Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.126089 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-lrc4n"] Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.240537 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n968c"] Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.242970 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.257089 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n968c"] Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.274191 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b95023af-625f-4997-bb93-7aa6092f173c-utilities\") pod \"redhat-marketplace-n968c\" (UID: \"b95023af-625f-4997-bb93-7aa6092f173c\") " pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.274521 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b95023af-625f-4997-bb93-7aa6092f173c-catalog-content\") pod \"redhat-marketplace-n968c\" (UID: \"b95023af-625f-4997-bb93-7aa6092f173c\") " pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.274616 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngwzs\" (UniqueName: \"kubernetes.io/projected/b95023af-625f-4997-bb93-7aa6092f173c-kube-api-access-ngwzs\") pod \"redhat-marketplace-n968c\" (UID: \"b95023af-625f-4997-bb93-7aa6092f173c\") " pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.376425 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b95023af-625f-4997-bb93-7aa6092f173c-catalog-content\") pod \"redhat-marketplace-n968c\" (UID: \"b95023af-625f-4997-bb93-7aa6092f173c\") " pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.376475 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngwzs\" (UniqueName: \"kubernetes.io/projected/b95023af-625f-4997-bb93-7aa6092f173c-kube-api-access-ngwzs\") pod \"redhat-marketplace-n968c\" (UID: \"b95023af-625f-4997-bb93-7aa6092f173c\") " pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.376544 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b95023af-625f-4997-bb93-7aa6092f173c-utilities\") pod \"redhat-marketplace-n968c\" (UID: \"b95023af-625f-4997-bb93-7aa6092f173c\") " pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.377390 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b95023af-625f-4997-bb93-7aa6092f173c-utilities\") pod \"redhat-marketplace-n968c\" (UID: \"b95023af-625f-4997-bb93-7aa6092f173c\") " pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.377434 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b95023af-625f-4997-bb93-7aa6092f173c-catalog-content\") pod \"redhat-marketplace-n968c\" (UID: \"b95023af-625f-4997-bb93-7aa6092f173c\") " pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.395525 4585 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ngwzs\" (UniqueName: \"kubernetes.io/projected/b95023af-625f-4997-bb93-7aa6092f173c-kube-api-access-ngwzs\") pod \"redhat-marketplace-n968c\" (UID: \"b95023af-625f-4997-bb93-7aa6092f173c\") " pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.560010 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.717099 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:24:43 crc kubenswrapper[4585]: I1201 14:24:43.717176 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:24:44 crc kubenswrapper[4585]: I1201 14:24:44.422784 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9444ec25-acd8-4038-8e7f-052ec1ba2f36" path="/var/lib/kubelet/pods/9444ec25-acd8-4038-8e7f-052ec1ba2f36/volumes" Dec 01 14:24:44 crc kubenswrapper[4585]: I1201 14:24:44.423690 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="977a54d5-9b5e-4399-80ce-6682e0a78d3c" path="/var/lib/kubelet/pods/977a54d5-9b5e-4399-80ce-6682e0a78d3c/volumes" Dec 01 14:24:44 crc kubenswrapper[4585]: I1201 14:24:44.425152 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2" path="/var/lib/kubelet/pods/bed28d37-5c2e-4088-b0ed-d4b3dbbc42b2/volumes" Dec 01 14:24:44 crc kubenswrapper[4585]: I1201 14:24:44.426377 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb978ed4-fbc2-4706-821e-3e820802d995" path="/var/lib/kubelet/pods/fb978ed4-fbc2-4706-821e-3e820802d995/volumes" Dec 01 14:24:45 crc kubenswrapper[4585]: I1201 14:24:45.977648 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74dzn" event={"ID":"cdaf429b-b6c0-4f60-a032-17262f7466f4","Type":"ContainerStarted","Data":"4ada3ea5c8329142576eed278462fc281c11cb8cc1f188f1fcccd098bdef9d74"} Dec 01 14:24:46 crc kubenswrapper[4585]: I1201 14:24:46.041038 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-1a88-account-create-update-wkjzb"] Dec 01 14:24:46 crc kubenswrapper[4585]: I1201 14:24:46.049403 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-1a88-account-create-update-wkjzb"] Dec 01 14:24:46 crc kubenswrapper[4585]: I1201 14:24:46.059111 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8a75-account-create-update-qzf69"] Dec 01 14:24:46 crc kubenswrapper[4585]: I1201 14:24:46.068206 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8a75-account-create-update-qzf69"] Dec 01 14:24:46 crc kubenswrapper[4585]: I1201 14:24:46.157856 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n968c"] Dec 01 14:24:46 crc kubenswrapper[4585]: I1201 14:24:46.424681 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0be6c2fe-329c-411f-8557-cf19f2f0be4c" path="/var/lib/kubelet/pods/0be6c2fe-329c-411f-8557-cf19f2f0be4c/volumes" Dec 01 14:24:46 crc kubenswrapper[4585]: I1201 14:24:46.426694 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27af24bb-3898-4376-9985-63237c74d33f" path="/var/lib/kubelet/pods/27af24bb-3898-4376-9985-63237c74d33f/volumes" Dec 01 14:24:46 crc kubenswrapper[4585]: I1201 14:24:46.988096 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n968c" event={"ID":"b95023af-625f-4997-bb93-7aa6092f173c","Type":"ContainerStarted","Data":"d5ac8b41adad8ad88a3509c6951860736d28f61931ce4e154af682a26ca987d6"} Dec 01 14:24:48 crc kubenswrapper[4585]: I1201 14:24:48.007858 4585 generic.go:334] "Generic (PLEG): container finished" podID="cdaf429b-b6c0-4f60-a032-17262f7466f4" containerID="4ada3ea5c8329142576eed278462fc281c11cb8cc1f188f1fcccd098bdef9d74" exitCode=0 Dec 01 14:24:48 crc kubenswrapper[4585]: I1201 14:24:48.008032 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74dzn" event={"ID":"cdaf429b-b6c0-4f60-a032-17262f7466f4","Type":"ContainerDied","Data":"4ada3ea5c8329142576eed278462fc281c11cb8cc1f188f1fcccd098bdef9d74"} Dec 01 14:24:49 crc kubenswrapper[4585]: I1201 14:24:49.019473 4585 generic.go:334] "Generic (PLEG): container finished" podID="b95023af-625f-4997-bb93-7aa6092f173c" containerID="269b1ac4f14ddc06b1cb5e71d9296065c7b95cb84e18464c975790fcd54e2e7a" exitCode=0 Dec 01 14:24:49 crc kubenswrapper[4585]: I1201 14:24:49.019565 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n968c" event={"ID":"b95023af-625f-4997-bb93-7aa6092f173c","Type":"ContainerDied","Data":"269b1ac4f14ddc06b1cb5e71d9296065c7b95cb84e18464c975790fcd54e2e7a"} Dec 01 14:24:49 crc kubenswrapper[4585]: I1201 14:24:49.038519 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74dzn" event={"ID":"cdaf429b-b6c0-4f60-a032-17262f7466f4","Type":"ContainerStarted","Data":"cd21273eef296ad9ac3725c0af39dada2494b1859f32619c2cbb1b8970c34aa0"} Dec 01 14:24:49 crc kubenswrapper[4585]: I1201 14:24:49.065993 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-74dzn" podStartSLOduration=2.4644324060000002 podStartE2EDuration="10.065958105s" podCreationTimestamp="2025-12-01 14:24:39 +0000 UTC" firstStartedPulling="2025-12-01 14:24:40.898393403 +0000 UTC m=+1594.882607258" lastFinishedPulling="2025-12-01 14:24:48.499919102 +0000 UTC m=+1602.484132957" observedRunningTime="2025-12-01 14:24:49.061374345 +0000 UTC m=+1603.045588200" watchObservedRunningTime="2025-12-01 14:24:49.065958105 +0000 UTC m=+1603.050171960" Dec 01 14:24:49 crc kubenswrapper[4585]: I1201 14:24:49.767936 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:49 crc kubenswrapper[4585]: I1201 14:24:49.768288 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:50 crc kubenswrapper[4585]: I1201 14:24:50.059442 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n968c" event={"ID":"b95023af-625f-4997-bb93-7aa6092f173c","Type":"ContainerStarted","Data":"a84cc75b8b7bd1582f2080a8968917a844152f4c878ac5d6d3cc86a0a2ccd7aa"} Dec 01 14:24:50 crc 
kubenswrapper[4585]: I1201 14:24:50.837590 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-74dzn" podUID="cdaf429b-b6c0-4f60-a032-17262f7466f4" containerName="registry-server" probeResult="failure" output=< Dec 01 14:24:50 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:24:50 crc kubenswrapper[4585]: > Dec 01 14:24:51 crc kubenswrapper[4585]: I1201 14:24:51.040917 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-fh5fg"] Dec 01 14:24:51 crc kubenswrapper[4585]: I1201 14:24:51.052031 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-fh5fg"] Dec 01 14:24:51 crc kubenswrapper[4585]: I1201 14:24:51.069420 4585 generic.go:334] "Generic (PLEG): container finished" podID="b95023af-625f-4997-bb93-7aa6092f173c" containerID="a84cc75b8b7bd1582f2080a8968917a844152f4c878ac5d6d3cc86a0a2ccd7aa" exitCode=0 Dec 01 14:24:51 crc kubenswrapper[4585]: I1201 14:24:51.069464 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n968c" event={"ID":"b95023af-625f-4997-bb93-7aa6092f173c","Type":"ContainerDied","Data":"a84cc75b8b7bd1582f2080a8968917a844152f4c878ac5d6d3cc86a0a2ccd7aa"} Dec 01 14:24:52 crc kubenswrapper[4585]: I1201 14:24:52.082236 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n968c" event={"ID":"b95023af-625f-4997-bb93-7aa6092f173c","Type":"ContainerStarted","Data":"7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922"} Dec 01 14:24:52 crc kubenswrapper[4585]: I1201 14:24:52.131829 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n968c" podStartSLOduration=6.451452407 podStartE2EDuration="9.131809207s" podCreationTimestamp="2025-12-01 14:24:43 +0000 UTC" firstStartedPulling="2025-12-01 14:24:49.024361283 +0000 UTC m=+1603.008575138" lastFinishedPulling="2025-12-01 14:24:51.704718083 +0000 UTC m=+1605.688931938" observedRunningTime="2025-12-01 14:24:52.12694818 +0000 UTC m=+1606.111162035" watchObservedRunningTime="2025-12-01 14:24:52.131809207 +0000 UTC m=+1606.116023062" Dec 01 14:24:52 crc kubenswrapper[4585]: I1201 14:24:52.423159 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99104c3-84fc-45e1-8b1d-e92a2cf55633" path="/var/lib/kubelet/pods/f99104c3-84fc-45e1-8b1d-e92a2cf55633/volumes" Dec 01 14:24:53 crc kubenswrapper[4585]: I1201 14:24:53.561029 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:53 crc kubenswrapper[4585]: I1201 14:24:53.562237 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:24:54 crc kubenswrapper[4585]: I1201 14:24:54.618617 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-n968c" podUID="b95023af-625f-4997-bb93-7aa6092f173c" containerName="registry-server" probeResult="failure" output=< Dec 01 14:24:54 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:24:54 crc kubenswrapper[4585]: > Dec 01 14:24:59 crc kubenswrapper[4585]: I1201 14:24:59.820166 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:59 crc kubenswrapper[4585]: I1201 14:24:59.868779 4585 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-74dzn" Dec 01 14:24:59 crc kubenswrapper[4585]: I1201 14:24:59.942342 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-74dzn"] Dec 01 14:25:00 crc kubenswrapper[4585]: I1201 14:25:00.078564 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-558pq"] Dec 01 14:25:00 crc kubenswrapper[4585]: I1201 14:25:00.079142 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-558pq" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" containerName="registry-server" containerID="cri-o://4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9" gracePeriod=2 Dec 01 14:25:00 crc kubenswrapper[4585]: I1201 14:25:00.541917 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-558pq" Dec 01 14:25:00 crc kubenswrapper[4585]: I1201 14:25:00.630800 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsbw6\" (UniqueName: \"kubernetes.io/projected/820218ea-5c55-45de-a8f8-1a512cf30252-kube-api-access-nsbw6\") pod \"820218ea-5c55-45de-a8f8-1a512cf30252\" (UID: \"820218ea-5c55-45de-a8f8-1a512cf30252\") " Dec 01 14:25:00 crc kubenswrapper[4585]: I1201 14:25:00.630942 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820218ea-5c55-45de-a8f8-1a512cf30252-catalog-content\") pod \"820218ea-5c55-45de-a8f8-1a512cf30252\" (UID: \"820218ea-5c55-45de-a8f8-1a512cf30252\") " Dec 01 14:25:00 crc kubenswrapper[4585]: I1201 14:25:00.631073 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820218ea-5c55-45de-a8f8-1a512cf30252-utilities\") pod \"820218ea-5c55-45de-a8f8-1a512cf30252\" (UID: \"820218ea-5c55-45de-a8f8-1a512cf30252\") " Dec 01 14:25:00 crc kubenswrapper[4585]: I1201 14:25:00.633129 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820218ea-5c55-45de-a8f8-1a512cf30252-utilities" (OuterVolumeSpecName: "utilities") pod "820218ea-5c55-45de-a8f8-1a512cf30252" (UID: "820218ea-5c55-45de-a8f8-1a512cf30252"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:25:00 crc kubenswrapper[4585]: I1201 14:25:00.666258 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820218ea-5c55-45de-a8f8-1a512cf30252-kube-api-access-nsbw6" (OuterVolumeSpecName: "kube-api-access-nsbw6") pod "820218ea-5c55-45de-a8f8-1a512cf30252" (UID: "820218ea-5c55-45de-a8f8-1a512cf30252"). InnerVolumeSpecName "kube-api-access-nsbw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:25:00 crc kubenswrapper[4585]: I1201 14:25:00.733338 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820218ea-5c55-45de-a8f8-1a512cf30252-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:25:00 crc kubenswrapper[4585]: I1201 14:25:00.733365 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsbw6\" (UniqueName: \"kubernetes.io/projected/820218ea-5c55-45de-a8f8-1a512cf30252-kube-api-access-nsbw6\") on node \"crc\" DevicePath \"\"" Dec 01 14:25:00 crc kubenswrapper[4585]: I1201 14:25:00.785700 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820218ea-5c55-45de-a8f8-1a512cf30252-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "820218ea-5c55-45de-a8f8-1a512cf30252" (UID: "820218ea-5c55-45de-a8f8-1a512cf30252"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:25:00 crc kubenswrapper[4585]: I1201 14:25:00.835187 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820218ea-5c55-45de-a8f8-1a512cf30252-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.176661 4585 generic.go:334] "Generic (PLEG): container finished" podID="820218ea-5c55-45de-a8f8-1a512cf30252" containerID="4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9" exitCode=0 Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.177793 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-558pq" Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.180024 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-558pq" event={"ID":"820218ea-5c55-45de-a8f8-1a512cf30252","Type":"ContainerDied","Data":"4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9"} Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.180059 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-558pq" event={"ID":"820218ea-5c55-45de-a8f8-1a512cf30252","Type":"ContainerDied","Data":"ef83dc908a5d72992f32df6dd1aa3e54553f0898a1bc5e75cadbaa6d520f8d75"} Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.180081 4585 scope.go:117] "RemoveContainer" containerID="4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9" Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.222846 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-558pq"] Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.227618 4585 scope.go:117] "RemoveContainer" containerID="7e07cfbf95e6c2ba1f18bfafc9b5cdc93a27b597f8047708ef703b94c7a5ce9b" Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.235518 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-558pq"] Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.257129 4585 scope.go:117] "RemoveContainer" containerID="44b7f196db22ed0219d5430739fa98b4186fe15596d53442742e9809c6013d29" Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.339455 4585 scope.go:117] "RemoveContainer" containerID="4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9" Dec 01 14:25:01 crc kubenswrapper[4585]: E1201 14:25:01.341396 4585 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9\": container with ID starting with 4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9 not found: ID does not exist" containerID="4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9" Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.341428 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9"} err="failed to get container status \"4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9\": rpc error: code = NotFound desc = could not find container \"4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9\": container with ID starting with 4abc2efc4436002a76a1d7f4cde6d3f25f9899a79ceafb60b4239a7db0446aa9 not found: ID does not exist" Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.341450 4585 scope.go:117] "RemoveContainer" containerID="7e07cfbf95e6c2ba1f18bfafc9b5cdc93a27b597f8047708ef703b94c7a5ce9b" Dec 01 14:25:01 crc kubenswrapper[4585]: E1201 14:25:01.341802 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e07cfbf95e6c2ba1f18bfafc9b5cdc93a27b597f8047708ef703b94c7a5ce9b\": container with ID starting with 7e07cfbf95e6c2ba1f18bfafc9b5cdc93a27b597f8047708ef703b94c7a5ce9b not found: ID does not exist" containerID="7e07cfbf95e6c2ba1f18bfafc9b5cdc93a27b597f8047708ef703b94c7a5ce9b" Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.341845 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e07cfbf95e6c2ba1f18bfafc9b5cdc93a27b597f8047708ef703b94c7a5ce9b"} err="failed to get container status \"7e07cfbf95e6c2ba1f18bfafc9b5cdc93a27b597f8047708ef703b94c7a5ce9b\": rpc error: code = NotFound desc = could not find container \"7e07cfbf95e6c2ba1f18bfafc9b5cdc93a27b597f8047708ef703b94c7a5ce9b\": container with ID starting with 7e07cfbf95e6c2ba1f18bfafc9b5cdc93a27b597f8047708ef703b94c7a5ce9b not found: ID does not exist" Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.341871 4585 scope.go:117] "RemoveContainer" containerID="44b7f196db22ed0219d5430739fa98b4186fe15596d53442742e9809c6013d29" Dec 01 14:25:01 crc kubenswrapper[4585]: E1201 14:25:01.342291 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44b7f196db22ed0219d5430739fa98b4186fe15596d53442742e9809c6013d29\": container with ID starting with 44b7f196db22ed0219d5430739fa98b4186fe15596d53442742e9809c6013d29 not found: ID does not exist" containerID="44b7f196db22ed0219d5430739fa98b4186fe15596d53442742e9809c6013d29" Dec 01 14:25:01 crc kubenswrapper[4585]: I1201 14:25:01.342334 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44b7f196db22ed0219d5430739fa98b4186fe15596d53442742e9809c6013d29"} err="failed to get container status \"44b7f196db22ed0219d5430739fa98b4186fe15596d53442742e9809c6013d29\": rpc error: code = NotFound desc = could not find container \"44b7f196db22ed0219d5430739fa98b4186fe15596d53442742e9809c6013d29\": container with ID starting with 44b7f196db22ed0219d5430739fa98b4186fe15596d53442742e9809c6013d29 not found: ID does not exist" Dec 01 14:25:02 crc kubenswrapper[4585]: I1201 14:25:02.429499 4585 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" path="/var/lib/kubelet/pods/820218ea-5c55-45de-a8f8-1a512cf30252/volumes" Dec 01 14:25:03 crc kubenswrapper[4585]: I1201 14:25:03.610071 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:25:03 crc kubenswrapper[4585]: I1201 14:25:03.660453 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:25:04 crc kubenswrapper[4585]: I1201 14:25:04.270336 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n968c"] Dec 01 14:25:05 crc kubenswrapper[4585]: I1201 14:25:05.218346 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n968c" podUID="b95023af-625f-4997-bb93-7aa6092f173c" containerName="registry-server" containerID="cri-o://7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922" gracePeriod=2 Dec 01 14:25:05 crc kubenswrapper[4585]: I1201 14:25:05.682355 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:25:05 crc kubenswrapper[4585]: I1201 14:25:05.822347 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngwzs\" (UniqueName: \"kubernetes.io/projected/b95023af-625f-4997-bb93-7aa6092f173c-kube-api-access-ngwzs\") pod \"b95023af-625f-4997-bb93-7aa6092f173c\" (UID: \"b95023af-625f-4997-bb93-7aa6092f173c\") " Dec 01 14:25:05 crc kubenswrapper[4585]: I1201 14:25:05.822393 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b95023af-625f-4997-bb93-7aa6092f173c-utilities\") pod \"b95023af-625f-4997-bb93-7aa6092f173c\" (UID: \"b95023af-625f-4997-bb93-7aa6092f173c\") " Dec 01 14:25:05 crc kubenswrapper[4585]: I1201 14:25:05.822553 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b95023af-625f-4997-bb93-7aa6092f173c-catalog-content\") pod \"b95023af-625f-4997-bb93-7aa6092f173c\" (UID: \"b95023af-625f-4997-bb93-7aa6092f173c\") " Dec 01 14:25:05 crc kubenswrapper[4585]: I1201 14:25:05.823532 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b95023af-625f-4997-bb93-7aa6092f173c-utilities" (OuterVolumeSpecName: "utilities") pod "b95023af-625f-4997-bb93-7aa6092f173c" (UID: "b95023af-625f-4997-bb93-7aa6092f173c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:25:05 crc kubenswrapper[4585]: I1201 14:25:05.828433 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b95023af-625f-4997-bb93-7aa6092f173c-kube-api-access-ngwzs" (OuterVolumeSpecName: "kube-api-access-ngwzs") pod "b95023af-625f-4997-bb93-7aa6092f173c" (UID: "b95023af-625f-4997-bb93-7aa6092f173c"). InnerVolumeSpecName "kube-api-access-ngwzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:25:05 crc kubenswrapper[4585]: I1201 14:25:05.841178 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b95023af-625f-4997-bb93-7aa6092f173c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b95023af-625f-4997-bb93-7aa6092f173c" (UID: "b95023af-625f-4997-bb93-7aa6092f173c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:25:05 crc kubenswrapper[4585]: I1201 14:25:05.925197 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b95023af-625f-4997-bb93-7aa6092f173c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:25:05 crc kubenswrapper[4585]: I1201 14:25:05.925237 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngwzs\" (UniqueName: \"kubernetes.io/projected/b95023af-625f-4997-bb93-7aa6092f173c-kube-api-access-ngwzs\") on node \"crc\" DevicePath \"\"" Dec 01 14:25:05 crc kubenswrapper[4585]: I1201 14:25:05.925252 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b95023af-625f-4997-bb93-7aa6092f173c-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.230014 4585 generic.go:334] "Generic (PLEG): container finished" podID="b95023af-625f-4997-bb93-7aa6092f173c" containerID="7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922" exitCode=0 Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.230051 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n968c" event={"ID":"b95023af-625f-4997-bb93-7aa6092f173c","Type":"ContainerDied","Data":"7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922"} Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.230078 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n968c" event={"ID":"b95023af-625f-4997-bb93-7aa6092f173c","Type":"ContainerDied","Data":"d5ac8b41adad8ad88a3509c6951860736d28f61931ce4e154af682a26ca987d6"} Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.230094 4585 scope.go:117] "RemoveContainer" containerID="7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922" Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.231100 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n968c" Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.260629 4585 scope.go:117] "RemoveContainer" containerID="a84cc75b8b7bd1582f2080a8968917a844152f4c878ac5d6d3cc86a0a2ccd7aa" Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.271306 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n968c"] Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.280449 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n968c"] Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.292212 4585 scope.go:117] "RemoveContainer" containerID="269b1ac4f14ddc06b1cb5e71d9296065c7b95cb84e18464c975790fcd54e2e7a" Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.380446 4585 scope.go:117] "RemoveContainer" containerID="7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922" Dec 01 14:25:06 crc kubenswrapper[4585]: E1201 14:25:06.382712 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922\": container with ID starting with 7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922 not found: ID does not exist" containerID="7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922" Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.382760 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922"} err="failed to get container status \"7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922\": rpc error: code = NotFound desc = could not find container \"7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922\": container with ID starting with 7d12822d3a88727f6fd023a2a0ae7e532b9dd2b0c582c2adae9a8ab132e0f922 not found: ID does not exist" Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.382782 4585 scope.go:117] "RemoveContainer" containerID="a84cc75b8b7bd1582f2080a8968917a844152f4c878ac5d6d3cc86a0a2ccd7aa" Dec 01 14:25:06 crc kubenswrapper[4585]: E1201 14:25:06.384687 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a84cc75b8b7bd1582f2080a8968917a844152f4c878ac5d6d3cc86a0a2ccd7aa\": container with ID starting with a84cc75b8b7bd1582f2080a8968917a844152f4c878ac5d6d3cc86a0a2ccd7aa not found: ID does not exist" containerID="a84cc75b8b7bd1582f2080a8968917a844152f4c878ac5d6d3cc86a0a2ccd7aa" Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.384719 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84cc75b8b7bd1582f2080a8968917a844152f4c878ac5d6d3cc86a0a2ccd7aa"} err="failed to get container status \"a84cc75b8b7bd1582f2080a8968917a844152f4c878ac5d6d3cc86a0a2ccd7aa\": rpc error: code = NotFound desc = could not find container \"a84cc75b8b7bd1582f2080a8968917a844152f4c878ac5d6d3cc86a0a2ccd7aa\": container with ID starting with a84cc75b8b7bd1582f2080a8968917a844152f4c878ac5d6d3cc86a0a2ccd7aa not found: ID does not exist" Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.384746 4585 scope.go:117] "RemoveContainer" containerID="269b1ac4f14ddc06b1cb5e71d9296065c7b95cb84e18464c975790fcd54e2e7a" Dec 01 14:25:06 crc kubenswrapper[4585]: E1201 14:25:06.387180 4585 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"269b1ac4f14ddc06b1cb5e71d9296065c7b95cb84e18464c975790fcd54e2e7a\": container with ID starting with 269b1ac4f14ddc06b1cb5e71d9296065c7b95cb84e18464c975790fcd54e2e7a not found: ID does not exist" containerID="269b1ac4f14ddc06b1cb5e71d9296065c7b95cb84e18464c975790fcd54e2e7a" Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.387205 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"269b1ac4f14ddc06b1cb5e71d9296065c7b95cb84e18464c975790fcd54e2e7a"} err="failed to get container status \"269b1ac4f14ddc06b1cb5e71d9296065c7b95cb84e18464c975790fcd54e2e7a\": rpc error: code = NotFound desc = could not find container \"269b1ac4f14ddc06b1cb5e71d9296065c7b95cb84e18464c975790fcd54e2e7a\": container with ID starting with 269b1ac4f14ddc06b1cb5e71d9296065c7b95cb84e18464c975790fcd54e2e7a not found: ID does not exist" Dec 01 14:25:06 crc kubenswrapper[4585]: I1201 14:25:06.438692 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b95023af-625f-4997-bb93-7aa6092f173c" path="/var/lib/kubelet/pods/b95023af-625f-4997-bb93-7aa6092f173c/volumes" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.011257 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gwpph"] Dec 01 14:25:11 crc kubenswrapper[4585]: E1201 14:25:11.013178 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b95023af-625f-4997-bb93-7aa6092f173c" containerName="extract-content" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.013276 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95023af-625f-4997-bb93-7aa6092f173c" containerName="extract-content" Dec 01 14:25:11 crc kubenswrapper[4585]: E1201 14:25:11.013364 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" containerName="extract-content" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.013420 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" containerName="extract-content" Dec 01 14:25:11 crc kubenswrapper[4585]: E1201 14:25:11.013665 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" containerName="registry-server" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.013757 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" containerName="registry-server" Dec 01 14:25:11 crc kubenswrapper[4585]: E1201 14:25:11.013823 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" containerName="extract-utilities" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.013939 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" containerName="extract-utilities" Dec 01 14:25:11 crc kubenswrapper[4585]: E1201 14:25:11.014054 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b95023af-625f-4997-bb93-7aa6092f173c" containerName="extract-utilities" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.014119 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95023af-625f-4997-bb93-7aa6092f173c" containerName="extract-utilities" Dec 01 14:25:11 crc kubenswrapper[4585]: E1201 14:25:11.014181 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b95023af-625f-4997-bb93-7aa6092f173c" 
containerName="registry-server" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.014644 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95023af-625f-4997-bb93-7aa6092f173c" containerName="registry-server" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.015054 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b95023af-625f-4997-bb93-7aa6092f173c" containerName="registry-server" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.015139 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="820218ea-5c55-45de-a8f8-1a512cf30252" containerName="registry-server" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.017150 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.025870 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwpph"] Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.110861 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5a1b478-757d-4300-8713-6c85214121aa-catalog-content\") pod \"certified-operators-gwpph\" (UID: \"a5a1b478-757d-4300-8713-6c85214121aa\") " pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.110935 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhs4\" (UniqueName: \"kubernetes.io/projected/a5a1b478-757d-4300-8713-6c85214121aa-kube-api-access-dlhs4\") pod \"certified-operators-gwpph\" (UID: \"a5a1b478-757d-4300-8713-6c85214121aa\") " pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.110993 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5a1b478-757d-4300-8713-6c85214121aa-utilities\") pod \"certified-operators-gwpph\" (UID: \"a5a1b478-757d-4300-8713-6c85214121aa\") " pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.212959 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlhs4\" (UniqueName: \"kubernetes.io/projected/a5a1b478-757d-4300-8713-6c85214121aa-kube-api-access-dlhs4\") pod \"certified-operators-gwpph\" (UID: \"a5a1b478-757d-4300-8713-6c85214121aa\") " pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.213077 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5a1b478-757d-4300-8713-6c85214121aa-utilities\") pod \"certified-operators-gwpph\" (UID: \"a5a1b478-757d-4300-8713-6c85214121aa\") " pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.213245 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5a1b478-757d-4300-8713-6c85214121aa-catalog-content\") pod \"certified-operators-gwpph\" (UID: \"a5a1b478-757d-4300-8713-6c85214121aa\") " pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.213564 4585 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5a1b478-757d-4300-8713-6c85214121aa-utilities\") pod \"certified-operators-gwpph\" (UID: \"a5a1b478-757d-4300-8713-6c85214121aa\") " pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.213740 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5a1b478-757d-4300-8713-6c85214121aa-catalog-content\") pod \"certified-operators-gwpph\" (UID: \"a5a1b478-757d-4300-8713-6c85214121aa\") " pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.241297 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhs4\" (UniqueName: \"kubernetes.io/projected/a5a1b478-757d-4300-8713-6c85214121aa-kube-api-access-dlhs4\") pod \"certified-operators-gwpph\" (UID: \"a5a1b478-757d-4300-8713-6c85214121aa\") " pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.353073 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:11 crc kubenswrapper[4585]: I1201 14:25:11.899143 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwpph"] Dec 01 14:25:12 crc kubenswrapper[4585]: I1201 14:25:12.295591 4585 generic.go:334] "Generic (PLEG): container finished" podID="a5a1b478-757d-4300-8713-6c85214121aa" containerID="926f314367b97fd6491b974cb574c03ec6dfbe6c33573649d40a484f4d1f006f" exitCode=0 Dec 01 14:25:12 crc kubenswrapper[4585]: I1201 14:25:12.295855 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpph" event={"ID":"a5a1b478-757d-4300-8713-6c85214121aa","Type":"ContainerDied","Data":"926f314367b97fd6491b974cb574c03ec6dfbe6c33573649d40a484f4d1f006f"} Dec 01 14:25:12 crc kubenswrapper[4585]: I1201 14:25:12.295890 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpph" event={"ID":"a5a1b478-757d-4300-8713-6c85214121aa","Type":"ContainerStarted","Data":"a84f885e30cfa7b9e96417fd085c770a5faed074659af2ebf5c579d03e8a756c"} Dec 01 14:25:13 crc kubenswrapper[4585]: I1201 14:25:13.716081 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:25:13 crc kubenswrapper[4585]: I1201 14:25:13.716357 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:25:14 crc kubenswrapper[4585]: I1201 14:25:14.329614 4585 generic.go:334] "Generic (PLEG): container finished" podID="a5a1b478-757d-4300-8713-6c85214121aa" containerID="d53d13c8ba5a74c479c831b71156314b1edf2f78bd1c6af7dd34154f00bda298" exitCode=0 Dec 01 14:25:14 crc kubenswrapper[4585]: I1201 14:25:14.329682 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpph" 
event={"ID":"a5a1b478-757d-4300-8713-6c85214121aa","Type":"ContainerDied","Data":"d53d13c8ba5a74c479c831b71156314b1edf2f78bd1c6af7dd34154f00bda298"} Dec 01 14:25:17 crc kubenswrapper[4585]: I1201 14:25:17.359036 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpph" event={"ID":"a5a1b478-757d-4300-8713-6c85214121aa","Type":"ContainerStarted","Data":"65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5"} Dec 01 14:25:17 crc kubenswrapper[4585]: I1201 14:25:17.381486 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gwpph" podStartSLOduration=3.481157457 podStartE2EDuration="7.381468351s" podCreationTimestamp="2025-12-01 14:25:10 +0000 UTC" firstStartedPulling="2025-12-01 14:25:12.297648771 +0000 UTC m=+1626.281862626" lastFinishedPulling="2025-12-01 14:25:16.197959665 +0000 UTC m=+1630.182173520" observedRunningTime="2025-12-01 14:25:17.373858191 +0000 UTC m=+1631.358072046" watchObservedRunningTime="2025-12-01 14:25:17.381468351 +0000 UTC m=+1631.365682206" Dec 01 14:25:21 crc kubenswrapper[4585]: I1201 14:25:21.353824 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:21 crc kubenswrapper[4585]: I1201 14:25:21.354082 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:21 crc kubenswrapper[4585]: I1201 14:25:21.397997 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:21 crc kubenswrapper[4585]: I1201 14:25:21.461897 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:21 crc kubenswrapper[4585]: I1201 14:25:21.641251 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwpph"] Dec 01 14:25:22 crc kubenswrapper[4585]: I1201 14:25:22.055096 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kg2tb"] Dec 01 14:25:22 crc kubenswrapper[4585]: I1201 14:25:22.064408 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kg2tb"] Dec 01 14:25:22 crc kubenswrapper[4585]: I1201 14:25:22.428138 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc9ff7ae-64b5-41db-94e0-5b30cb0a923c" path="/var/lib/kubelet/pods/cc9ff7ae-64b5-41db-94e0-5b30cb0a923c/volumes" Dec 01 14:25:23 crc kubenswrapper[4585]: I1201 14:25:23.408420 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gwpph" podUID="a5a1b478-757d-4300-8713-6c85214121aa" containerName="registry-server" containerID="cri-o://65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5" gracePeriod=2 Dec 01 14:25:23 crc kubenswrapper[4585]: I1201 14:25:23.886957 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:23 crc kubenswrapper[4585]: I1201 14:25:23.975450 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlhs4\" (UniqueName: \"kubernetes.io/projected/a5a1b478-757d-4300-8713-6c85214121aa-kube-api-access-dlhs4\") pod \"a5a1b478-757d-4300-8713-6c85214121aa\" (UID: \"a5a1b478-757d-4300-8713-6c85214121aa\") " Dec 01 14:25:23 crc kubenswrapper[4585]: I1201 14:25:23.975589 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5a1b478-757d-4300-8713-6c85214121aa-utilities\") pod \"a5a1b478-757d-4300-8713-6c85214121aa\" (UID: \"a5a1b478-757d-4300-8713-6c85214121aa\") " Dec 01 14:25:23 crc kubenswrapper[4585]: I1201 14:25:23.975695 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5a1b478-757d-4300-8713-6c85214121aa-catalog-content\") pod \"a5a1b478-757d-4300-8713-6c85214121aa\" (UID: \"a5a1b478-757d-4300-8713-6c85214121aa\") " Dec 01 14:25:23 crc kubenswrapper[4585]: I1201 14:25:23.976636 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5a1b478-757d-4300-8713-6c85214121aa-utilities" (OuterVolumeSpecName: "utilities") pod "a5a1b478-757d-4300-8713-6c85214121aa" (UID: "a5a1b478-757d-4300-8713-6c85214121aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:25:23 crc kubenswrapper[4585]: I1201 14:25:23.980492 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a1b478-757d-4300-8713-6c85214121aa-kube-api-access-dlhs4" (OuterVolumeSpecName: "kube-api-access-dlhs4") pod "a5a1b478-757d-4300-8713-6c85214121aa" (UID: "a5a1b478-757d-4300-8713-6c85214121aa"). InnerVolumeSpecName "kube-api-access-dlhs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.028534 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5a1b478-757d-4300-8713-6c85214121aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5a1b478-757d-4300-8713-6c85214121aa" (UID: "a5a1b478-757d-4300-8713-6c85214121aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.078010 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlhs4\" (UniqueName: \"kubernetes.io/projected/a5a1b478-757d-4300-8713-6c85214121aa-kube-api-access-dlhs4\") on node \"crc\" DevicePath \"\"" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.078053 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5a1b478-757d-4300-8713-6c85214121aa-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.078069 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5a1b478-757d-4300-8713-6c85214121aa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.419063 4585 generic.go:334] "Generic (PLEG): container finished" podID="a5a1b478-757d-4300-8713-6c85214121aa" containerID="65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5" exitCode=0 Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.419144 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwpph" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.424414 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpph" event={"ID":"a5a1b478-757d-4300-8713-6c85214121aa","Type":"ContainerDied","Data":"65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5"} Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.424454 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpph" event={"ID":"a5a1b478-757d-4300-8713-6c85214121aa","Type":"ContainerDied","Data":"a84f885e30cfa7b9e96417fd085c770a5faed074659af2ebf5c579d03e8a756c"} Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.424478 4585 scope.go:117] "RemoveContainer" containerID="65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.466011 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwpph"] Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.471550 4585 scope.go:117] "RemoveContainer" containerID="d53d13c8ba5a74c479c831b71156314b1edf2f78bd1c6af7dd34154f00bda298" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.475686 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gwpph"] Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.490227 4585 scope.go:117] "RemoveContainer" containerID="926f314367b97fd6491b974cb574c03ec6dfbe6c33573649d40a484f4d1f006f" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.545589 4585 scope.go:117] "RemoveContainer" containerID="65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5" Dec 01 14:25:24 crc kubenswrapper[4585]: E1201 14:25:24.546061 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5\": container with ID starting with 65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5 not found: ID does not exist" containerID="65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.546098 
4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5"} err="failed to get container status \"65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5\": rpc error: code = NotFound desc = could not find container \"65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5\": container with ID starting with 65de880c0221d31d914d17d469e5be28ad53dc69ddd4426ef1375b47edc83ae5 not found: ID does not exist" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.546120 4585 scope.go:117] "RemoveContainer" containerID="d53d13c8ba5a74c479c831b71156314b1edf2f78bd1c6af7dd34154f00bda298" Dec 01 14:25:24 crc kubenswrapper[4585]: E1201 14:25:24.546412 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d53d13c8ba5a74c479c831b71156314b1edf2f78bd1c6af7dd34154f00bda298\": container with ID starting with d53d13c8ba5a74c479c831b71156314b1edf2f78bd1c6af7dd34154f00bda298 not found: ID does not exist" containerID="d53d13c8ba5a74c479c831b71156314b1edf2f78bd1c6af7dd34154f00bda298" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.546456 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53d13c8ba5a74c479c831b71156314b1edf2f78bd1c6af7dd34154f00bda298"} err="failed to get container status \"d53d13c8ba5a74c479c831b71156314b1edf2f78bd1c6af7dd34154f00bda298\": rpc error: code = NotFound desc = could not find container \"d53d13c8ba5a74c479c831b71156314b1edf2f78bd1c6af7dd34154f00bda298\": container with ID starting with d53d13c8ba5a74c479c831b71156314b1edf2f78bd1c6af7dd34154f00bda298 not found: ID does not exist" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.546487 4585 scope.go:117] "RemoveContainer" containerID="926f314367b97fd6491b974cb574c03ec6dfbe6c33573649d40a484f4d1f006f" Dec 01 14:25:24 crc kubenswrapper[4585]: E1201 14:25:24.546822 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"926f314367b97fd6491b974cb574c03ec6dfbe6c33573649d40a484f4d1f006f\": container with ID starting with 926f314367b97fd6491b974cb574c03ec6dfbe6c33573649d40a484f4d1f006f not found: ID does not exist" containerID="926f314367b97fd6491b974cb574c03ec6dfbe6c33573649d40a484f4d1f006f" Dec 01 14:25:24 crc kubenswrapper[4585]: I1201 14:25:24.546883 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"926f314367b97fd6491b974cb574c03ec6dfbe6c33573649d40a484f4d1f006f"} err="failed to get container status \"926f314367b97fd6491b974cb574c03ec6dfbe6c33573649d40a484f4d1f006f\": rpc error: code = NotFound desc = could not find container \"926f314367b97fd6491b974cb574c03ec6dfbe6c33573649d40a484f4d1f006f\": container with ID starting with 926f314367b97fd6491b974cb574c03ec6dfbe6c33573649d40a484f4d1f006f not found: ID does not exist" Dec 01 14:25:26 crc kubenswrapper[4585]: I1201 14:25:26.423813 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a1b478-757d-4300-8713-6c85214121aa" path="/var/lib/kubelet/pods/a5a1b478-757d-4300-8713-6c85214121aa/volumes" Dec 01 14:25:33 crc kubenswrapper[4585]: I1201 14:25:33.354927 4585 scope.go:117] "RemoveContainer" containerID="fe9aa6d17ef960a658e666c9d866827cb0d099f697364ce2f6cc4b9686014cb1" Dec 01 14:25:33 crc kubenswrapper[4585]: I1201 14:25:33.386291 4585 scope.go:117] "RemoveContainer" 
containerID="cd2e776b376b295748b994c27808891c21e62c16b16ebbc155364236a0b76001" Dec 01 14:25:33 crc kubenswrapper[4585]: I1201 14:25:33.438029 4585 scope.go:117] "RemoveContainer" containerID="824042bdef4d457c0c50bdb2e61a134d74854465c34f29dd09592c239618a939" Dec 01 14:25:33 crc kubenswrapper[4585]: I1201 14:25:33.465064 4585 scope.go:117] "RemoveContainer" containerID="435c0ba6b3fa213eaa1b88590bc3d810dc145ce2c8fb5ff6e1ef3ff9258846a7" Dec 01 14:25:33 crc kubenswrapper[4585]: I1201 14:25:33.505146 4585 generic.go:334] "Generic (PLEG): container finished" podID="bed7040b-db55-41ae-9384-7b730ced5331" containerID="bf31b4fedb87a866365c826937f1c2fea02f17fec2b3ede10021f3aaca9ba72a" exitCode=0 Dec 01 14:25:33 crc kubenswrapper[4585]: I1201 14:25:33.505207 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" event={"ID":"bed7040b-db55-41ae-9384-7b730ced5331","Type":"ContainerDied","Data":"bf31b4fedb87a866365c826937f1c2fea02f17fec2b3ede10021f3aaca9ba72a"} Dec 01 14:25:33 crc kubenswrapper[4585]: I1201 14:25:33.524895 4585 scope.go:117] "RemoveContainer" containerID="290f7aca9059a288646ea974a383ec6db3819c88e856a2add53f4a313d8b00fe" Dec 01 14:25:33 crc kubenswrapper[4585]: I1201 14:25:33.556815 4585 scope.go:117] "RemoveContainer" containerID="d9e2499863533663da2eb2ac3ce14aec95113619c7e1324357b02610d35f5953" Dec 01 14:25:33 crc kubenswrapper[4585]: I1201 14:25:33.598767 4585 scope.go:117] "RemoveContainer" containerID="8bd698d13098a76d2d67f8c25fdd60899fe0d676c9b860ef4f351828a01641ec" Dec 01 14:25:33 crc kubenswrapper[4585]: I1201 14:25:33.617480 4585 scope.go:117] "RemoveContainer" containerID="6c1ef3d1614a76dcda37f03f84c33346175a6b23c90a90aa3258596190ac356f" Dec 01 14:25:33 crc kubenswrapper[4585]: I1201 14:25:33.635007 4585 scope.go:117] "RemoveContainer" containerID="9c70defbbb5c1c8842a26ab188f71e6a51ea09a520e94f285f9597e1d7859d01" Dec 01 14:25:34 crc kubenswrapper[4585]: I1201 14:25:34.912030 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.002378 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bed7040b-db55-41ae-9384-7b730ced5331-ssh-key\") pod \"bed7040b-db55-41ae-9384-7b730ced5331\" (UID: \"bed7040b-db55-41ae-9384-7b730ced5331\") " Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.003344 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bed7040b-db55-41ae-9384-7b730ced5331-inventory\") pod \"bed7040b-db55-41ae-9384-7b730ced5331\" (UID: \"bed7040b-db55-41ae-9384-7b730ced5331\") " Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.003548 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfnkb\" (UniqueName: \"kubernetes.io/projected/bed7040b-db55-41ae-9384-7b730ced5331-kube-api-access-rfnkb\") pod \"bed7040b-db55-41ae-9384-7b730ced5331\" (UID: \"bed7040b-db55-41ae-9384-7b730ced5331\") " Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.028836 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed7040b-db55-41ae-9384-7b730ced5331-kube-api-access-rfnkb" (OuterVolumeSpecName: "kube-api-access-rfnkb") pod "bed7040b-db55-41ae-9384-7b730ced5331" (UID: "bed7040b-db55-41ae-9384-7b730ced5331"). 
InnerVolumeSpecName "kube-api-access-rfnkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.039767 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed7040b-db55-41ae-9384-7b730ced5331-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bed7040b-db55-41ae-9384-7b730ced5331" (UID: "bed7040b-db55-41ae-9384-7b730ced5331"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.064872 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed7040b-db55-41ae-9384-7b730ced5331-inventory" (OuterVolumeSpecName: "inventory") pod "bed7040b-db55-41ae-9384-7b730ced5331" (UID: "bed7040b-db55-41ae-9384-7b730ced5331"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.107088 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bed7040b-db55-41ae-9384-7b730ced5331-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.107125 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bed7040b-db55-41ae-9384-7b730ced5331-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.107139 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfnkb\" (UniqueName: \"kubernetes.io/projected/bed7040b-db55-41ae-9384-7b730ced5331-kube-api-access-rfnkb\") on node \"crc\" DevicePath \"\"" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.524796 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" event={"ID":"bed7040b-db55-41ae-9384-7b730ced5331","Type":"ContainerDied","Data":"e87f87769bbc0ef948759755bc25b07de2b59cef66194bf229b1a8ba753c90a3"} Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.524845 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e87f87769bbc0ef948759755bc25b07de2b59cef66194bf229b1a8ba753c90a3" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.524844 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.599650 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h"] Dec 01 14:25:35 crc kubenswrapper[4585]: E1201 14:25:35.600056 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a1b478-757d-4300-8713-6c85214121aa" containerName="extract-utilities" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.600072 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a1b478-757d-4300-8713-6c85214121aa" containerName="extract-utilities" Dec 01 14:25:35 crc kubenswrapper[4585]: E1201 14:25:35.600087 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed7040b-db55-41ae-9384-7b730ced5331" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.600093 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed7040b-db55-41ae-9384-7b730ced5331" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 14:25:35 crc kubenswrapper[4585]: E1201 14:25:35.600120 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a1b478-757d-4300-8713-6c85214121aa" containerName="registry-server" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.600126 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a1b478-757d-4300-8713-6c85214121aa" containerName="registry-server" Dec 01 14:25:35 crc kubenswrapper[4585]: E1201 14:25:35.600138 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a1b478-757d-4300-8713-6c85214121aa" containerName="extract-content" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.600143 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a1b478-757d-4300-8713-6c85214121aa" containerName="extract-content" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.600300 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5a1b478-757d-4300-8713-6c85214121aa" containerName="registry-server" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.600319 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bed7040b-db55-41ae-9384-7b730ced5331" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.603248 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.608301 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.608490 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.608521 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.608710 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.624640 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h"] Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.718371 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h\" (UID: \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.718588 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h\" (UID: \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.718759 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg68j\" (UniqueName: \"kubernetes.io/projected/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-kube-api-access-lg68j\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h\" (UID: \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.820670 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg68j\" (UniqueName: \"kubernetes.io/projected/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-kube-api-access-lg68j\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h\" (UID: \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.820818 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h\" (UID: \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.820844 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h\" (UID: \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.826615 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h\" (UID: \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.826652 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h\" (UID: \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.839197 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg68j\" (UniqueName: \"kubernetes.io/projected/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-kube-api-access-lg68j\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h\" (UID: \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:25:35 crc kubenswrapper[4585]: I1201 14:25:35.920348 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:25:36 crc kubenswrapper[4585]: I1201 14:25:36.027132 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-825nv"] Dec 01 14:25:36 crc kubenswrapper[4585]: I1201 14:25:36.032771 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-825nv"] Dec 01 14:25:36 crc kubenswrapper[4585]: I1201 14:25:36.426679 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f08b08-85f9-4df9-97ce-f0f25238e889" path="/var/lib/kubelet/pods/30f08b08-85f9-4df9-97ce-f0f25238e889/volumes" Dec 01 14:25:36 crc kubenswrapper[4585]: I1201 14:25:36.491038 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h"] Dec 01 14:25:36 crc kubenswrapper[4585]: I1201 14:25:36.533916 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" event={"ID":"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8","Type":"ContainerStarted","Data":"89b9c665e97411b19d3734dc69584a9149c19b2815eec5fe942ab673d52e740d"} Dec 01 14:25:37 crc kubenswrapper[4585]: I1201 14:25:37.036192 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xbp2z"] Dec 01 14:25:37 crc kubenswrapper[4585]: I1201 14:25:37.046468 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xbp2z"] Dec 01 14:25:37 crc kubenswrapper[4585]: I1201 14:25:37.542343 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" event={"ID":"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8","Type":"ContainerStarted","Data":"79fca915fe92b8620ff49f714743fe50f0e4dfdbba545db273f982a663486b34"} Dec 01 14:25:37 crc kubenswrapper[4585]: I1201 
14:25:37.556320 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" podStartSLOduration=1.7554311299999998 podStartE2EDuration="2.556301388s" podCreationTimestamp="2025-12-01 14:25:35 +0000 UTC" firstStartedPulling="2025-12-01 14:25:36.504804878 +0000 UTC m=+1650.489018733" lastFinishedPulling="2025-12-01 14:25:37.305675136 +0000 UTC m=+1651.289888991" observedRunningTime="2025-12-01 14:25:37.554368917 +0000 UTC m=+1651.538582772" watchObservedRunningTime="2025-12-01 14:25:37.556301388 +0000 UTC m=+1651.540515243" Dec 01 14:25:38 crc kubenswrapper[4585]: I1201 14:25:38.424856 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b34175d-932c-4b94-b6cf-b164891fc965" path="/var/lib/kubelet/pods/1b34175d-932c-4b94-b6cf-b164891fc965/volumes" Dec 01 14:25:43 crc kubenswrapper[4585]: I1201 14:25:43.715851 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:25:43 crc kubenswrapper[4585]: I1201 14:25:43.716207 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:25:43 crc kubenswrapper[4585]: I1201 14:25:43.716249 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:25:43 crc kubenswrapper[4585]: I1201 14:25:43.716927 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:25:43 crc kubenswrapper[4585]: I1201 14:25:43.717010 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" gracePeriod=600 Dec 01 14:25:43 crc kubenswrapper[4585]: E1201 14:25:43.846189 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:25:44 crc kubenswrapper[4585]: I1201 14:25:44.598835 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" exitCode=0 Dec 01 14:25:44 crc kubenswrapper[4585]: I1201 14:25:44.598913 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
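
The `pod_startup_latency_tracker` record above quotes every timestamp its two durations are derived from, so they can be re-checked by hand: for this pod, `podStartE2EDuration` equals `watchObservedRunningTime` minus `podCreationTimestamp`, and `podStartSLOduration` is that same interval with the image-pull window (`lastFinishedPulling` minus `firstStartedPulling`) subtracted, i.e. the SLO figure excludes time spent pulling images. A small check with the values copied from that record (the interpretation is inferred from the arithmetic, not from kubelet source):

```python
# Seconds past 14:25:00, copied from the pod_startup_latency_tracker record
# for openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h.
created      = 35.0            # podCreationTimestamp  2025-12-01 14:25:35 +0000 UTC
pull_start   = 36.504804878    # firstStartedPulling
pull_end     = 37.305675136    # lastFinishedPulling
running_seen = 37.556301388    # watchObservedRunningTime

e2e = running_seen - created            # 2.556301388 s -> podStartE2EDuration="2.556301388s"
slo = e2e - (pull_end - pull_start)     # 1.755431130 s -> podStartSLOduration=1.75543113
print(f"e2e={e2e:.9f}s slo={slo:.9f}s")
```
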
pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593"} Dec 01 14:25:44 crc kubenswrapper[4585]: I1201 14:25:44.599211 4585 scope.go:117] "RemoveContainer" containerID="1fbf41077863f3fc04eeb32135f7e1a50fc3bb2ac74df27d60186d6226d4dc1b" Dec 01 14:25:44 crc kubenswrapper[4585]: I1201 14:25:44.599820 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:25:44 crc kubenswrapper[4585]: E1201 14:25:44.600221 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:25:47 crc kubenswrapper[4585]: I1201 14:25:47.043412 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-sf4wz"] Dec 01 14:25:47 crc kubenswrapper[4585]: I1201 14:25:47.057338 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-sf4wz"] Dec 01 14:25:48 crc kubenswrapper[4585]: I1201 14:25:48.434623 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7557d295-9d3a-4d0f-933a-77390e7e179e" path="/var/lib/kubelet/pods/7557d295-9d3a-4d0f-933a-77390e7e179e/volumes" Dec 01 14:25:59 crc kubenswrapper[4585]: I1201 14:25:59.412344 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:25:59 crc kubenswrapper[4585]: E1201 14:25:59.413093 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:26:03 crc kubenswrapper[4585]: I1201 14:26:03.052238 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-wmh4t"] Dec 01 14:26:03 crc kubenswrapper[4585]: I1201 14:26:03.061694 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-wmh4t"] Dec 01 14:26:04 crc kubenswrapper[4585]: I1201 14:26:04.424170 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dead942-d6c5-4a4a-aa5e-5c57b6da0c48" path="/var/lib/kubelet/pods/7dead942-d6c5-4a4a-aa5e-5c57b6da0c48/volumes" Dec 01 14:26:11 crc kubenswrapper[4585]: I1201 14:26:11.412700 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:26:11 crc kubenswrapper[4585]: E1201 14:26:11.413410 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 
14:26:24 crc kubenswrapper[4585]: I1201 14:26:24.413536 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:26:24 crc kubenswrapper[4585]: E1201 14:26:24.414597 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:26:33 crc kubenswrapper[4585]: I1201 14:26:33.880814 4585 scope.go:117] "RemoveContainer" containerID="c3462b3634ded13b7f47d7b9fed51f7c11e55c11f88e9558889f60d36bb6fe91" Dec 01 14:26:33 crc kubenswrapper[4585]: I1201 14:26:33.932801 4585 scope.go:117] "RemoveContainer" containerID="8159ec23eb18a08fdc106c8e732cbfa469362438df2218ee2363245f975bf8cb" Dec 01 14:26:33 crc kubenswrapper[4585]: I1201 14:26:33.980123 4585 scope.go:117] "RemoveContainer" containerID="7b5fe4f21372621fcdc69c13b5289393dffb7bef4fab18f7188c18ce70125f59" Dec 01 14:26:34 crc kubenswrapper[4585]: I1201 14:26:34.036330 4585 scope.go:117] "RemoveContainer" containerID="084e10a06a307eb11f7434765a84865afe81b8eb1a3e6453eb7a9fdfd8da628e" Dec 01 14:26:39 crc kubenswrapper[4585]: I1201 14:26:39.413633 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:26:39 crc kubenswrapper[4585]: E1201 14:26:39.414423 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:26:42 crc kubenswrapper[4585]: I1201 14:26:42.052140 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3107-account-create-update-h6npx"] Dec 01 14:26:42 crc kubenswrapper[4585]: I1201 14:26:42.068104 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xw7sm"] Dec 01 14:26:42 crc kubenswrapper[4585]: I1201 14:26:42.076076 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xw7sm"] Dec 01 14:26:42 crc kubenswrapper[4585]: I1201 14:26:42.083193 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3107-account-create-update-h6npx"] Dec 01 14:26:42 crc kubenswrapper[4585]: I1201 14:26:42.426920 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd5ea5ee-2cac-4fab-893d-83569ffbae4c" path="/var/lib/kubelet/pods/cd5ea5ee-2cac-4fab-893d-83569ffbae4c/volumes" Dec 01 14:26:42 crc kubenswrapper[4585]: I1201 14:26:42.427689 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df1492cf-2c9b-4573-9e32-dd372af19bfe" path="/var/lib/kubelet/pods/df1492cf-2c9b-4573-9e32-dd372af19bfe/volumes" Dec 01 14:26:43 crc kubenswrapper[4585]: I1201 14:26:43.035343 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-kwrm4"] Dec 01 14:26:43 crc kubenswrapper[4585]: I1201 14:26:43.045140 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-kwrm4"] Dec 
01 14:26:44 crc kubenswrapper[4585]: I1201 14:26:44.029183 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nj957"] Dec 01 14:26:44 crc kubenswrapper[4585]: I1201 14:26:44.037791 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-96f3-account-create-update-gbhsj"] Dec 01 14:26:44 crc kubenswrapper[4585]: I1201 14:26:44.045989 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-be71-account-create-update-jns4b"] Dec 01 14:26:44 crc kubenswrapper[4585]: I1201 14:26:44.055449 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nj957"] Dec 01 14:26:44 crc kubenswrapper[4585]: I1201 14:26:44.063267 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-96f3-account-create-update-gbhsj"] Dec 01 14:26:44 crc kubenswrapper[4585]: I1201 14:26:44.070159 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-be71-account-create-update-jns4b"] Dec 01 14:26:44 crc kubenswrapper[4585]: I1201 14:26:44.424209 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e77338c-2d1b-4b3f-812a-d8419ed44fb8" path="/var/lib/kubelet/pods/5e77338c-2d1b-4b3f-812a-d8419ed44fb8/volumes" Dec 01 14:26:44 crc kubenswrapper[4585]: I1201 14:26:44.425403 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fae72ee-6647-4105-ab94-ed2ab6bed7da" path="/var/lib/kubelet/pods/8fae72ee-6647-4105-ab94-ed2ab6bed7da/volumes" Dec 01 14:26:44 crc kubenswrapper[4585]: I1201 14:26:44.426320 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9288a94-0029-4c7b-825a-ad1d005c736d" path="/var/lib/kubelet/pods/c9288a94-0029-4c7b-825a-ad1d005c736d/volumes" Dec 01 14:26:44 crc kubenswrapper[4585]: I1201 14:26:44.427348 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf5a50c8-3aeb-4d5c-b313-b2eed4da3517" path="/var/lib/kubelet/pods/cf5a50c8-3aeb-4d5c-b313-b2eed4da3517/volumes" Dec 01 14:26:50 crc kubenswrapper[4585]: I1201 14:26:50.416601 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:26:50 crc kubenswrapper[4585]: E1201 14:26:50.418243 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:26:51 crc kubenswrapper[4585]: I1201 14:26:51.179427 4585 generic.go:334] "Generic (PLEG): container finished" podID="16e7590a-927c-4ff1-8eb8-3b4a248ce6f8" containerID="79fca915fe92b8620ff49f714743fe50f0e4dfdbba545db273f982a663486b34" exitCode=0 Dec 01 14:26:51 crc kubenswrapper[4585]: I1201 14:26:51.179520 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" event={"ID":"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8","Type":"ContainerDied","Data":"79fca915fe92b8620ff49f714743fe50f0e4dfdbba545db273f982a663486b34"} Dec 01 14:26:52 crc kubenswrapper[4585]: I1201 14:26:52.597551 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:26:52 crc kubenswrapper[4585]: I1201 14:26:52.777814 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg68j\" (UniqueName: \"kubernetes.io/projected/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-kube-api-access-lg68j\") pod \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\" (UID: \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\") " Dec 01 14:26:52 crc kubenswrapper[4585]: I1201 14:26:52.777912 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-inventory\") pod \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\" (UID: \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\") " Dec 01 14:26:52 crc kubenswrapper[4585]: I1201 14:26:52.778097 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-ssh-key\") pod \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\" (UID: \"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8\") " Dec 01 14:26:52 crc kubenswrapper[4585]: I1201 14:26:52.785220 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-kube-api-access-lg68j" (OuterVolumeSpecName: "kube-api-access-lg68j") pod "16e7590a-927c-4ff1-8eb8-3b4a248ce6f8" (UID: "16e7590a-927c-4ff1-8eb8-3b4a248ce6f8"). InnerVolumeSpecName "kube-api-access-lg68j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:26:52 crc kubenswrapper[4585]: I1201 14:26:52.804817 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-inventory" (OuterVolumeSpecName: "inventory") pod "16e7590a-927c-4ff1-8eb8-3b4a248ce6f8" (UID: "16e7590a-927c-4ff1-8eb8-3b4a248ce6f8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:26:52 crc kubenswrapper[4585]: I1201 14:26:52.805225 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "16e7590a-927c-4ff1-8eb8-3b4a248ce6f8" (UID: "16e7590a-927c-4ff1-8eb8-3b4a248ce6f8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:26:52 crc kubenswrapper[4585]: I1201 14:26:52.880497 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:26:52 crc kubenswrapper[4585]: I1201 14:26:52.880553 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg68j\" (UniqueName: \"kubernetes.io/projected/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-kube-api-access-lg68j\") on node \"crc\" DevicePath \"\"" Dec 01 14:26:52 crc kubenswrapper[4585]: I1201 14:26:52.880566 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16e7590a-927c-4ff1-8eb8-3b4a248ce6f8-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.206618 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" event={"ID":"16e7590a-927c-4ff1-8eb8-3b4a248ce6f8","Type":"ContainerDied","Data":"89b9c665e97411b19d3734dc69584a9149c19b2815eec5fe942ab673d52e740d"} Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.206923 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89b9c665e97411b19d3734dc69584a9149c19b2815eec5fe942ab673d52e740d" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.206691 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.361123 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs"] Dec 01 14:26:53 crc kubenswrapper[4585]: E1201 14:26:53.361493 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e7590a-927c-4ff1-8eb8-3b4a248ce6f8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.361512 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e7590a-927c-4ff1-8eb8-3b4a248ce6f8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.361703 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e7590a-927c-4ff1-8eb8-3b4a248ce6f8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.362329 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.371990 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs"] Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.375845 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.376086 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.376305 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.376636 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.491948 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf78q\" (UniqueName: \"kubernetes.io/projected/a09e5590-d28a-4c20-80eb-ff1f448ec290-kube-api-access-sf78q\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs\" (UID: \"a09e5590-d28a-4c20-80eb-ff1f448ec290\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.493010 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a09e5590-d28a-4c20-80eb-ff1f448ec290-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs\" (UID: \"a09e5590-d28a-4c20-80eb-ff1f448ec290\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.493319 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a09e5590-d28a-4c20-80eb-ff1f448ec290-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs\" (UID: \"a09e5590-d28a-4c20-80eb-ff1f448ec290\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.595481 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf78q\" (UniqueName: \"kubernetes.io/projected/a09e5590-d28a-4c20-80eb-ff1f448ec290-kube-api-access-sf78q\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs\" (UID: \"a09e5590-d28a-4c20-80eb-ff1f448ec290\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.595588 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a09e5590-d28a-4c20-80eb-ff1f448ec290-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs\" (UID: \"a09e5590-d28a-4c20-80eb-ff1f448ec290\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.595647 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a09e5590-d28a-4c20-80eb-ff1f448ec290-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs\" (UID: \"a09e5590-d28a-4c20-80eb-ff1f448ec290\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.601544 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a09e5590-d28a-4c20-80eb-ff1f448ec290-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs\" (UID: \"a09e5590-d28a-4c20-80eb-ff1f448ec290\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.602495 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a09e5590-d28a-4c20-80eb-ff1f448ec290-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs\" (UID: \"a09e5590-d28a-4c20-80eb-ff1f448ec290\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.618789 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf78q\" (UniqueName: \"kubernetes.io/projected/a09e5590-d28a-4c20-80eb-ff1f448ec290-kube-api-access-sf78q\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs\" (UID: \"a09e5590-d28a-4c20-80eb-ff1f448ec290\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:26:53 crc kubenswrapper[4585]: I1201 14:26:53.678504 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:26:54 crc kubenswrapper[4585]: I1201 14:26:54.454831 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs"] Dec 01 14:26:55 crc kubenswrapper[4585]: I1201 14:26:55.239895 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" event={"ID":"a09e5590-d28a-4c20-80eb-ff1f448ec290","Type":"ContainerStarted","Data":"ee7000c1b705369e63795cc34ab679bcba89276d49df7d151b4c09c73a5bfa56"} Dec 01 14:26:56 crc kubenswrapper[4585]: I1201 14:26:56.247956 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" event={"ID":"a09e5590-d28a-4c20-80eb-ff1f448ec290","Type":"ContainerStarted","Data":"4023d59803baa9187aed770bfef49fda915cd465a75fe1f60c4ed412cde696d1"} Dec 01 14:26:56 crc kubenswrapper[4585]: I1201 14:26:56.268350 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" podStartSLOduration=2.613895321 podStartE2EDuration="3.268331296s" podCreationTimestamp="2025-12-01 14:26:53 +0000 UTC" firstStartedPulling="2025-12-01 14:26:54.434484673 +0000 UTC m=+1728.418698528" lastFinishedPulling="2025-12-01 14:26:55.088920648 +0000 UTC m=+1729.073134503" observedRunningTime="2025-12-01 14:26:56.264826024 +0000 UTC m=+1730.249039899" watchObservedRunningTime="2025-12-01 14:26:56.268331296 +0000 UTC m=+1730.252545151" Dec 01 14:27:00 crc kubenswrapper[4585]: I1201 14:27:00.294683 4585 generic.go:334] "Generic (PLEG): container finished" podID="a09e5590-d28a-4c20-80eb-ff1f448ec290" containerID="4023d59803baa9187aed770bfef49fda915cd465a75fe1f60c4ed412cde696d1" exitCode=0 Dec 01 14:27:00 crc kubenswrapper[4585]: I1201 
14:27:00.294909 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" event={"ID":"a09e5590-d28a-4c20-80eb-ff1f448ec290","Type":"ContainerDied","Data":"4023d59803baa9187aed770bfef49fda915cd465a75fe1f60c4ed412cde696d1"} Dec 01 14:27:01 crc kubenswrapper[4585]: I1201 14:27:01.699830 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:27:01 crc kubenswrapper[4585]: I1201 14:27:01.875048 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf78q\" (UniqueName: \"kubernetes.io/projected/a09e5590-d28a-4c20-80eb-ff1f448ec290-kube-api-access-sf78q\") pod \"a09e5590-d28a-4c20-80eb-ff1f448ec290\" (UID: \"a09e5590-d28a-4c20-80eb-ff1f448ec290\") " Dec 01 14:27:01 crc kubenswrapper[4585]: I1201 14:27:01.875152 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a09e5590-d28a-4c20-80eb-ff1f448ec290-ssh-key\") pod \"a09e5590-d28a-4c20-80eb-ff1f448ec290\" (UID: \"a09e5590-d28a-4c20-80eb-ff1f448ec290\") " Dec 01 14:27:01 crc kubenswrapper[4585]: I1201 14:27:01.875178 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a09e5590-d28a-4c20-80eb-ff1f448ec290-inventory\") pod \"a09e5590-d28a-4c20-80eb-ff1f448ec290\" (UID: \"a09e5590-d28a-4c20-80eb-ff1f448ec290\") " Dec 01 14:27:01 crc kubenswrapper[4585]: I1201 14:27:01.880576 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09e5590-d28a-4c20-80eb-ff1f448ec290-kube-api-access-sf78q" (OuterVolumeSpecName: "kube-api-access-sf78q") pod "a09e5590-d28a-4c20-80eb-ff1f448ec290" (UID: "a09e5590-d28a-4c20-80eb-ff1f448ec290"). InnerVolumeSpecName "kube-api-access-sf78q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:27:01 crc kubenswrapper[4585]: I1201 14:27:01.906103 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09e5590-d28a-4c20-80eb-ff1f448ec290-inventory" (OuterVolumeSpecName: "inventory") pod "a09e5590-d28a-4c20-80eb-ff1f448ec290" (UID: "a09e5590-d28a-4c20-80eb-ff1f448ec290"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:27:01 crc kubenswrapper[4585]: I1201 14:27:01.907047 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09e5590-d28a-4c20-80eb-ff1f448ec290-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a09e5590-d28a-4c20-80eb-ff1f448ec290" (UID: "a09e5590-d28a-4c20-80eb-ff1f448ec290"). InnerVolumeSpecName "ssh-key". 
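
Every pod teardown in this section logs the same three-step volume sequence, once per secret/projected volume: `operationExecutor.UnmountVolume started`, then `UnmountVolume.TearDown succeeded`, then `Volume detached ... DevicePath ""`. When auditing a longer capture it can be handy to fold those messages into a per-volume state table and flag any volume that started unmounting but never reached "detached". A rough sketch follows; the phase strings and field names are copied from these records, everything else (helper names, usage) is illustrative, and it can consume the messages produced by a splitter like the `parse_records` sketch earlier.

```python
import re
from collections import defaultdict

# Phase strings copied from the reconciler/operation_generator records above.
PHASES = ("operationExecutor.UnmountVolume started",
          "UnmountVolume.TearDown succeeded",
          "Volume detached")

def volume_name(msg: str):
    """Best-effort short volume name from an unmount-related message."""
    m = re.search(r'OuterVolumeSpecName: "([^"]+)"', msg)     # TearDown records
    if m is None:
        m = re.search(r'for volume \\"([^"\\]+)\\"', msg)     # started/detached records
    return m.group(1) if m else None

def unmount_progress(messages):
    """Map volume name -> furthest phase index reached (0, 1, or 2)."""
    state = defaultdict(int)
    for msg in messages:
        for i, phase in enumerate(PHASES):
            if phase in msg and (name := volume_name(msg)):
                state[name] = max(state[name], i)
    return dict(state)

# usage: unmount_progress(msg for *_, msg in parse_records(blob))
# expected for the validate-network pod above:
#   {"kube-api-access-sf78q": 2, "ssh-key": 2, "inventory": 2}
```
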
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:27:01 crc kubenswrapper[4585]: I1201 14:27:01.977940 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf78q\" (UniqueName: \"kubernetes.io/projected/a09e5590-d28a-4c20-80eb-ff1f448ec290-kube-api-access-sf78q\") on node \"crc\" DevicePath \"\"" Dec 01 14:27:01 crc kubenswrapper[4585]: I1201 14:27:01.977998 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a09e5590-d28a-4c20-80eb-ff1f448ec290-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:27:01 crc kubenswrapper[4585]: I1201 14:27:01.978011 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a09e5590-d28a-4c20-80eb-ff1f448ec290-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.318001 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" event={"ID":"a09e5590-d28a-4c20-80eb-ff1f448ec290","Type":"ContainerDied","Data":"ee7000c1b705369e63795cc34ab679bcba89276d49df7d151b4c09c73a5bfa56"} Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.318051 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee7000c1b705369e63795cc34ab679bcba89276d49df7d151b4c09c73a5bfa56" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.318066 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.387353 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq"] Dec 01 14:27:02 crc kubenswrapper[4585]: E1201 14:27:02.387742 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09e5590-d28a-4c20-80eb-ff1f448ec290" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.387756 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09e5590-d28a-4c20-80eb-ff1f448ec290" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.387935 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09e5590-d28a-4c20-80eb-ff1f448ec290" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.388513 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.391231 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.391433 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.391662 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.391796 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.403880 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq"] Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.588816 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqh75\" (UniqueName: \"kubernetes.io/projected/0919f038-3bf5-4f3c-baa3-5c85ceef4819-kube-api-access-tqh75\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pzrlq\" (UID: \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.588944 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0919f038-3bf5-4f3c-baa3-5c85ceef4819-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pzrlq\" (UID: \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.589105 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0919f038-3bf5-4f3c-baa3-5c85ceef4819-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pzrlq\" (UID: \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.696356 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqh75\" (UniqueName: \"kubernetes.io/projected/0919f038-3bf5-4f3c-baa3-5c85ceef4819-kube-api-access-tqh75\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pzrlq\" (UID: \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.696454 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0919f038-3bf5-4f3c-baa3-5c85ceef4819-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pzrlq\" (UID: \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.696547 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0919f038-3bf5-4f3c-baa3-5c85ceef4819-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pzrlq\" (UID: 
\"0919f038-3bf5-4f3c-baa3-5c85ceef4819\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.702793 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0919f038-3bf5-4f3c-baa3-5c85ceef4819-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pzrlq\" (UID: \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.713624 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0919f038-3bf5-4f3c-baa3-5c85ceef4819-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pzrlq\" (UID: \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:02 crc kubenswrapper[4585]: I1201 14:27:02.720530 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqh75\" (UniqueName: \"kubernetes.io/projected/0919f038-3bf5-4f3c-baa3-5c85ceef4819-kube-api-access-tqh75\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-pzrlq\" (UID: \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:03 crc kubenswrapper[4585]: I1201 14:27:03.016335 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:03 crc kubenswrapper[4585]: I1201 14:27:03.552357 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq"] Dec 01 14:27:04 crc kubenswrapper[4585]: I1201 14:27:04.336505 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" event={"ID":"0919f038-3bf5-4f3c-baa3-5c85ceef4819","Type":"ContainerStarted","Data":"b72330e2997776e92e5701c45bc6d1705271bb4cd181b17c26ee7657681cec90"} Dec 01 14:27:04 crc kubenswrapper[4585]: I1201 14:27:04.336839 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" event={"ID":"0919f038-3bf5-4f3c-baa3-5c85ceef4819","Type":"ContainerStarted","Data":"75b3a1e2e56b567a6959bf6a61c395d6a9a6f6c184c1e33cb15168c87b8fce03"} Dec 01 14:27:04 crc kubenswrapper[4585]: I1201 14:27:04.359366 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" podStartSLOduration=1.940794197 podStartE2EDuration="2.359348686s" podCreationTimestamp="2025-12-01 14:27:02 +0000 UTC" firstStartedPulling="2025-12-01 14:27:03.559099732 +0000 UTC m=+1737.543313597" lastFinishedPulling="2025-12-01 14:27:03.977654231 +0000 UTC m=+1737.961868086" observedRunningTime="2025-12-01 14:27:04.359287565 +0000 UTC m=+1738.343501430" watchObservedRunningTime="2025-12-01 14:27:04.359348686 +0000 UTC m=+1738.343562531" Dec 01 14:27:05 crc kubenswrapper[4585]: I1201 14:27:05.412867 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:27:05 crc kubenswrapper[4585]: E1201 14:27:05.413343 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:27:16 crc kubenswrapper[4585]: I1201 14:27:16.418522 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:27:16 crc kubenswrapper[4585]: E1201 14:27:16.419357 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:27:20 crc kubenswrapper[4585]: I1201 14:27:20.040736 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lc2cr"] Dec 01 14:27:20 crc kubenswrapper[4585]: I1201 14:27:20.056208 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lc2cr"] Dec 01 14:27:20 crc kubenswrapper[4585]: I1201 14:27:20.426649 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d820d720-5215-4178-ad7f-f0b493bf2529" path="/var/lib/kubelet/pods/d820d720-5215-4178-ad7f-f0b493bf2529/volumes" Dec 01 14:27:30 crc kubenswrapper[4585]: I1201 14:27:30.412582 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:27:30 crc kubenswrapper[4585]: E1201 14:27:30.413325 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:27:34 crc kubenswrapper[4585]: I1201 14:27:34.136512 4585 scope.go:117] "RemoveContainer" containerID="e68645bbc5a4775bda75665356d68ad4171f5673bf6730506664aa42347b28d5" Dec 01 14:27:34 crc kubenswrapper[4585]: I1201 14:27:34.178766 4585 scope.go:117] "RemoveContainer" containerID="68f80228535ebc2a63137bc109a9b59ff5ca0b48b8f3cb66e64b313712519a9f" Dec 01 14:27:34 crc kubenswrapper[4585]: I1201 14:27:34.225635 4585 scope.go:117] "RemoveContainer" containerID="6b10d374ff49f0b52403252a14d9078e06a365dc921dac51dd9a7129fea487fd" Dec 01 14:27:34 crc kubenswrapper[4585]: I1201 14:27:34.258214 4585 scope.go:117] "RemoveContainer" containerID="56623aba05bacca8f53c48692379602533229bec61a02d4e039c500069208602" Dec 01 14:27:34 crc kubenswrapper[4585]: I1201 14:27:34.313636 4585 scope.go:117] "RemoveContainer" containerID="6c085fb6bd248981b08329c5f59cde937909cb1150b8d67b7030daab26098120" Dec 01 14:27:34 crc kubenswrapper[4585]: I1201 14:27:34.349125 4585 scope.go:117] "RemoveContainer" containerID="096cf172eca6df5c714d4622a97f04fa426f712f10545ae184a763b5eb8ad637" Dec 01 14:27:34 crc kubenswrapper[4585]: I1201 14:27:34.413391 4585 scope.go:117] "RemoveContainer" containerID="d82a1c29492bac4e19d334478f06a5f4e23b64ac27e343752d065d1407092ed0" Dec 01 14:27:43 crc kubenswrapper[4585]: I1201 14:27:43.412902 4585 scope.go:117] "RemoveContainer" 
containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:27:43 crc kubenswrapper[4585]: E1201 14:27:43.413694 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:27:44 crc kubenswrapper[4585]: I1201 14:27:44.683472 4585 generic.go:334] "Generic (PLEG): container finished" podID="0919f038-3bf5-4f3c-baa3-5c85ceef4819" containerID="b72330e2997776e92e5701c45bc6d1705271bb4cd181b17c26ee7657681cec90" exitCode=0 Dec 01 14:27:44 crc kubenswrapper[4585]: I1201 14:27:44.683514 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" event={"ID":"0919f038-3bf5-4f3c-baa3-5c85ceef4819","Type":"ContainerDied","Data":"b72330e2997776e92e5701c45bc6d1705271bb4cd181b17c26ee7657681cec90"} Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.044491 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-b7vt8"] Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.061017 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-b7vt8"] Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.078356 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vfxqx"] Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.087657 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vfxqx"] Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.157347 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.324192 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0919f038-3bf5-4f3c-baa3-5c85ceef4819-ssh-key\") pod \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\" (UID: \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\") " Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.324346 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0919f038-3bf5-4f3c-baa3-5c85ceef4819-inventory\") pod \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\" (UID: \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\") " Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.324515 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqh75\" (UniqueName: \"kubernetes.io/projected/0919f038-3bf5-4f3c-baa3-5c85ceef4819-kube-api-access-tqh75\") pod \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\" (UID: \"0919f038-3bf5-4f3c-baa3-5c85ceef4819\") " Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.335755 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0919f038-3bf5-4f3c-baa3-5c85ceef4819-kube-api-access-tqh75" (OuterVolumeSpecName: "kube-api-access-tqh75") pod "0919f038-3bf5-4f3c-baa3-5c85ceef4819" (UID: "0919f038-3bf5-4f3c-baa3-5c85ceef4819"). InnerVolumeSpecName "kube-api-access-tqh75". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.349607 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0919f038-3bf5-4f3c-baa3-5c85ceef4819-inventory" (OuterVolumeSpecName: "inventory") pod "0919f038-3bf5-4f3c-baa3-5c85ceef4819" (UID: "0919f038-3bf5-4f3c-baa3-5c85ceef4819"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.364408 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0919f038-3bf5-4f3c-baa3-5c85ceef4819-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0919f038-3bf5-4f3c-baa3-5c85ceef4819" (UID: "0919f038-3bf5-4f3c-baa3-5c85ceef4819"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.426712 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0919f038-3bf5-4f3c-baa3-5c85ceef4819-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.426754 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqh75\" (UniqueName: \"kubernetes.io/projected/0919f038-3bf5-4f3c-baa3-5c85ceef4819-kube-api-access-tqh75\") on node \"crc\" DevicePath \"\"" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.426771 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0919f038-3bf5-4f3c-baa3-5c85ceef4819-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.426720 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a54fa3-c02b-4d90-8375-b50ab8de60fe" path="/var/lib/kubelet/pods/31a54fa3-c02b-4d90-8375-b50ab8de60fe/volumes" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.427533 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30" path="/var/lib/kubelet/pods/fa5f1c18-8b57-47a1-ba5d-5ae47fd1ae30/volumes" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.707193 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" event={"ID":"0919f038-3bf5-4f3c-baa3-5c85ceef4819","Type":"ContainerDied","Data":"75b3a1e2e56b567a6959bf6a61c395d6a9a6f6c184c1e33cb15168c87b8fce03"} Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.707237 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75b3a1e2e56b567a6959bf6a61c395d6a9a6f6c184c1e33cb15168c87b8fce03" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.707255 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-pzrlq" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.790275 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8"] Dec 01 14:27:46 crc kubenswrapper[4585]: E1201 14:27:46.790806 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0919f038-3bf5-4f3c-baa3-5c85ceef4819" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.790832 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0919f038-3bf5-4f3c-baa3-5c85ceef4819" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.791093 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0919f038-3bf5-4f3c-baa3-5c85ceef4819" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.791854 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.796061 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.796269 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.796454 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.796725 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.805417 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8"] Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.936197 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j54ws\" (UniqueName: \"kubernetes.io/projected/c12f5739-060f-4047-b987-d8c958aeb133-kube-api-access-j54ws\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8\" (UID: \"c12f5739-060f-4047-b987-d8c958aeb133\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.936559 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12f5739-060f-4047-b987-d8c958aeb133-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8\" (UID: \"c12f5739-060f-4047-b987-d8c958aeb133\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:27:46 crc kubenswrapper[4585]: I1201 14:27:46.936596 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12f5739-060f-4047-b987-d8c958aeb133-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8\" (UID: \"c12f5739-060f-4047-b987-d8c958aeb133\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:27:47 crc kubenswrapper[4585]: I1201 14:27:47.038246 4585 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12f5739-060f-4047-b987-d8c958aeb133-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8\" (UID: \"c12f5739-060f-4047-b987-d8c958aeb133\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:27:47 crc kubenswrapper[4585]: I1201 14:27:47.038560 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12f5739-060f-4047-b987-d8c958aeb133-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8\" (UID: \"c12f5739-060f-4047-b987-d8c958aeb133\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:27:47 crc kubenswrapper[4585]: I1201 14:27:47.038736 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j54ws\" (UniqueName: \"kubernetes.io/projected/c12f5739-060f-4047-b987-d8c958aeb133-kube-api-access-j54ws\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8\" (UID: \"c12f5739-060f-4047-b987-d8c958aeb133\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:27:47 crc kubenswrapper[4585]: I1201 14:27:47.043408 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12f5739-060f-4047-b987-d8c958aeb133-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8\" (UID: \"c12f5739-060f-4047-b987-d8c958aeb133\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:27:47 crc kubenswrapper[4585]: I1201 14:27:47.046826 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12f5739-060f-4047-b987-d8c958aeb133-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8\" (UID: \"c12f5739-060f-4047-b987-d8c958aeb133\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:27:47 crc kubenswrapper[4585]: I1201 14:27:47.055576 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j54ws\" (UniqueName: \"kubernetes.io/projected/c12f5739-060f-4047-b987-d8c958aeb133-kube-api-access-j54ws\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8\" (UID: \"c12f5739-060f-4047-b987-d8c958aeb133\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:27:47 crc kubenswrapper[4585]: I1201 14:27:47.115820 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:27:47 crc kubenswrapper[4585]: I1201 14:27:47.637145 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8"] Dec 01 14:27:47 crc kubenswrapper[4585]: I1201 14:27:47.716163 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" event={"ID":"c12f5739-060f-4047-b987-d8c958aeb133","Type":"ContainerStarted","Data":"8807841ba009377fda390d03d4cee02a4399d50c2148112eb1cd7107c4767302"} Dec 01 14:27:48 crc kubenswrapper[4585]: I1201 14:27:48.726631 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" event={"ID":"c12f5739-060f-4047-b987-d8c958aeb133","Type":"ContainerStarted","Data":"aa793d7d7b8ff7956e8b98de9aa9ea2590f2aeb75f7d44b075a348b8b3a271c3"} Dec 01 14:27:48 crc kubenswrapper[4585]: I1201 14:27:48.747207 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" podStartSLOduration=2.222855482 podStartE2EDuration="2.747189574s" podCreationTimestamp="2025-12-01 14:27:46 +0000 UTC" firstStartedPulling="2025-12-01 14:27:47.650498629 +0000 UTC m=+1781.634712484" lastFinishedPulling="2025-12-01 14:27:48.174832711 +0000 UTC m=+1782.159046576" observedRunningTime="2025-12-01 14:27:48.741967055 +0000 UTC m=+1782.726180920" watchObservedRunningTime="2025-12-01 14:27:48.747189574 +0000 UTC m=+1782.731403419" Dec 01 14:27:54 crc kubenswrapper[4585]: I1201 14:27:54.413606 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:27:54 crc kubenswrapper[4585]: E1201 14:27:54.414470 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:28:08 crc kubenswrapper[4585]: I1201 14:28:08.412955 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:28:08 crc kubenswrapper[4585]: E1201 14:28:08.413942 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:28:20 crc kubenswrapper[4585]: I1201 14:28:20.414264 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:28:20 crc kubenswrapper[4585]: E1201 14:28:20.415016 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:28:30 crc kubenswrapper[4585]: I1201 14:28:30.050319 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-q986x"] Dec 01 14:28:30 crc kubenswrapper[4585]: I1201 14:28:30.057542 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-q986x"] Dec 01 14:28:30 crc kubenswrapper[4585]: I1201 14:28:30.432219 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae2d266c-cec5-4d51-afcc-d948d3cd7903" path="/var/lib/kubelet/pods/ae2d266c-cec5-4d51-afcc-d948d3cd7903/volumes" Dec 01 14:28:32 crc kubenswrapper[4585]: I1201 14:28:32.413841 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:28:32 crc kubenswrapper[4585]: E1201 14:28:32.414409 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:28:34 crc kubenswrapper[4585]: I1201 14:28:34.552855 4585 scope.go:117] "RemoveContainer" containerID="166f287ec8ac7df3fc80d66b577864cf2107dbc4710cf00e57d7f82a993c8ae9" Dec 01 14:28:34 crc kubenswrapper[4585]: I1201 14:28:34.595154 4585 scope.go:117] "RemoveContainer" containerID="06d1b0509366b8cb1d0354b0ea6d376a208ae3ae2bdfdfa9d6a09d3b1c2672f7" Dec 01 14:28:34 crc kubenswrapper[4585]: I1201 14:28:34.640277 4585 scope.go:117] "RemoveContainer" containerID="fcea060dd8edaf28d77724cb85e1b60a91cb53c09162f38fdc417a7e1376ad5c" Dec 01 14:28:45 crc kubenswrapper[4585]: I1201 14:28:45.224005 4585 generic.go:334] "Generic (PLEG): container finished" podID="c12f5739-060f-4047-b987-d8c958aeb133" containerID="aa793d7d7b8ff7956e8b98de9aa9ea2590f2aeb75f7d44b075a348b8b3a271c3" exitCode=0 Dec 01 14:28:45 crc kubenswrapper[4585]: I1201 14:28:45.224073 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" event={"ID":"c12f5739-060f-4047-b987-d8c958aeb133","Type":"ContainerDied","Data":"aa793d7d7b8ff7956e8b98de9aa9ea2590f2aeb75f7d44b075a348b8b3a271c3"} Dec 01 14:28:46 crc kubenswrapper[4585]: I1201 14:28:46.429366 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:28:46 crc kubenswrapper[4585]: E1201 14:28:46.429857 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:28:46 crc kubenswrapper[4585]: I1201 14:28:46.650440 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:28:46 crc kubenswrapper[4585]: I1201 14:28:46.716196 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j54ws\" (UniqueName: \"kubernetes.io/projected/c12f5739-060f-4047-b987-d8c958aeb133-kube-api-access-j54ws\") pod \"c12f5739-060f-4047-b987-d8c958aeb133\" (UID: \"c12f5739-060f-4047-b987-d8c958aeb133\") " Dec 01 14:28:46 crc kubenswrapper[4585]: I1201 14:28:46.716446 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12f5739-060f-4047-b987-d8c958aeb133-inventory\") pod \"c12f5739-060f-4047-b987-d8c958aeb133\" (UID: \"c12f5739-060f-4047-b987-d8c958aeb133\") " Dec 01 14:28:46 crc kubenswrapper[4585]: I1201 14:28:46.716492 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12f5739-060f-4047-b987-d8c958aeb133-ssh-key\") pod \"c12f5739-060f-4047-b987-d8c958aeb133\" (UID: \"c12f5739-060f-4047-b987-d8c958aeb133\") " Dec 01 14:28:46 crc kubenswrapper[4585]: I1201 14:28:46.723304 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c12f5739-060f-4047-b987-d8c958aeb133-kube-api-access-j54ws" (OuterVolumeSpecName: "kube-api-access-j54ws") pod "c12f5739-060f-4047-b987-d8c958aeb133" (UID: "c12f5739-060f-4047-b987-d8c958aeb133"). InnerVolumeSpecName "kube-api-access-j54ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:28:46 crc kubenswrapper[4585]: I1201 14:28:46.745821 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12f5739-060f-4047-b987-d8c958aeb133-inventory" (OuterVolumeSpecName: "inventory") pod "c12f5739-060f-4047-b987-d8c958aeb133" (UID: "c12f5739-060f-4047-b987-d8c958aeb133"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:28:46 crc kubenswrapper[4585]: I1201 14:28:46.746158 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12f5739-060f-4047-b987-d8c958aeb133-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c12f5739-060f-4047-b987-d8c958aeb133" (UID: "c12f5739-060f-4047-b987-d8c958aeb133"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:28:46 crc kubenswrapper[4585]: I1201 14:28:46.819317 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j54ws\" (UniqueName: \"kubernetes.io/projected/c12f5739-060f-4047-b987-d8c958aeb133-kube-api-access-j54ws\") on node \"crc\" DevicePath \"\"" Dec 01 14:28:46 crc kubenswrapper[4585]: I1201 14:28:46.819353 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c12f5739-060f-4047-b987-d8c958aeb133-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:28:46 crc kubenswrapper[4585]: I1201 14:28:46.819362 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c12f5739-060f-4047-b987-d8c958aeb133-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.244648 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" event={"ID":"c12f5739-060f-4047-b987-d8c958aeb133","Type":"ContainerDied","Data":"8807841ba009377fda390d03d4cee02a4399d50c2148112eb1cd7107c4767302"} Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.244700 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8807841ba009377fda390d03d4cee02a4399d50c2148112eb1cd7107c4767302" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.244717 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.375222 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pbt6b"] Dec 01 14:28:47 crc kubenswrapper[4585]: E1201 14:28:47.375828 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12f5739-060f-4047-b987-d8c958aeb133" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.375852 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12f5739-060f-4047-b987-d8c958aeb133" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.376153 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12f5739-060f-4047-b987-d8c958aeb133" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.377012 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.378898 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.379586 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.379841 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.381199 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pbt6b"] Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.381239 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.430108 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a42d9c-35e4-437d-8f54-47a3cef27d7e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pbt6b\" (UID: \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.430239 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/31a42d9c-35e4-437d-8f54-47a3cef27d7e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pbt6b\" (UID: \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.430275 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7drd\" (UniqueName: \"kubernetes.io/projected/31a42d9c-35e4-437d-8f54-47a3cef27d7e-kube-api-access-b7drd\") pod \"ssh-known-hosts-edpm-deployment-pbt6b\" (UID: \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.531955 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a42d9c-35e4-437d-8f54-47a3cef27d7e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pbt6b\" (UID: \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.532089 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/31a42d9c-35e4-437d-8f54-47a3cef27d7e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pbt6b\" (UID: \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.532130 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7drd\" (UniqueName: \"kubernetes.io/projected/31a42d9c-35e4-437d-8f54-47a3cef27d7e-kube-api-access-b7drd\") pod \"ssh-known-hosts-edpm-deployment-pbt6b\" (UID: \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:47 crc 
kubenswrapper[4585]: I1201 14:28:47.535736 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a42d9c-35e4-437d-8f54-47a3cef27d7e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pbt6b\" (UID: \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.540398 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/31a42d9c-35e4-437d-8f54-47a3cef27d7e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pbt6b\" (UID: \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.549076 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7drd\" (UniqueName: \"kubernetes.io/projected/31a42d9c-35e4-437d-8f54-47a3cef27d7e-kube-api-access-b7drd\") pod \"ssh-known-hosts-edpm-deployment-pbt6b\" (UID: \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\") " pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:47 crc kubenswrapper[4585]: I1201 14:28:47.704719 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:48 crc kubenswrapper[4585]: I1201 14:28:48.237231 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pbt6b"] Dec 01 14:28:48 crc kubenswrapper[4585]: I1201 14:28:48.251536 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 14:28:49 crc kubenswrapper[4585]: I1201 14:28:49.264024 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" event={"ID":"31a42d9c-35e4-437d-8f54-47a3cef27d7e","Type":"ContainerStarted","Data":"1b12707d28071d91152c9fcd80b11bc4e3500f062d8fe9757056489571eef566"} Dec 01 14:28:49 crc kubenswrapper[4585]: I1201 14:28:49.264321 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" event={"ID":"31a42d9c-35e4-437d-8f54-47a3cef27d7e","Type":"ContainerStarted","Data":"5c894a018b9f3d501caaabd7a00651eb2545b6c654dc61048917a044841118c9"} Dec 01 14:28:49 crc kubenswrapper[4585]: I1201 14:28:49.283787 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" podStartSLOduration=1.7036485940000001 podStartE2EDuration="2.283769204s" podCreationTimestamp="2025-12-01 14:28:47 +0000 UTC" firstStartedPulling="2025-12-01 14:28:48.251353075 +0000 UTC m=+1842.235566930" lastFinishedPulling="2025-12-01 14:28:48.831473685 +0000 UTC m=+1842.815687540" observedRunningTime="2025-12-01 14:28:49.280954279 +0000 UTC m=+1843.265168144" watchObservedRunningTime="2025-12-01 14:28:49.283769204 +0000 UTC m=+1843.267983069" Dec 01 14:28:56 crc kubenswrapper[4585]: I1201 14:28:56.329761 4585 generic.go:334] "Generic (PLEG): container finished" podID="31a42d9c-35e4-437d-8f54-47a3cef27d7e" containerID="1b12707d28071d91152c9fcd80b11bc4e3500f062d8fe9757056489571eef566" exitCode=0 Dec 01 14:28:56 crc kubenswrapper[4585]: I1201 14:28:56.329926 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" 
event={"ID":"31a42d9c-35e4-437d-8f54-47a3cef27d7e","Type":"ContainerDied","Data":"1b12707d28071d91152c9fcd80b11bc4e3500f062d8fe9757056489571eef566"} Dec 01 14:28:57 crc kubenswrapper[4585]: I1201 14:28:57.775137 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:57 crc kubenswrapper[4585]: I1201 14:28:57.927292 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7drd\" (UniqueName: \"kubernetes.io/projected/31a42d9c-35e4-437d-8f54-47a3cef27d7e-kube-api-access-b7drd\") pod \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\" (UID: \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\") " Dec 01 14:28:57 crc kubenswrapper[4585]: I1201 14:28:57.927534 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a42d9c-35e4-437d-8f54-47a3cef27d7e-ssh-key-openstack-edpm-ipam\") pod \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\" (UID: \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\") " Dec 01 14:28:57 crc kubenswrapper[4585]: I1201 14:28:57.927759 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/31a42d9c-35e4-437d-8f54-47a3cef27d7e-inventory-0\") pod \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\" (UID: \"31a42d9c-35e4-437d-8f54-47a3cef27d7e\") " Dec 01 14:28:57 crc kubenswrapper[4585]: I1201 14:28:57.932763 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a42d9c-35e4-437d-8f54-47a3cef27d7e-kube-api-access-b7drd" (OuterVolumeSpecName: "kube-api-access-b7drd") pod "31a42d9c-35e4-437d-8f54-47a3cef27d7e" (UID: "31a42d9c-35e4-437d-8f54-47a3cef27d7e"). InnerVolumeSpecName "kube-api-access-b7drd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:28:57 crc kubenswrapper[4585]: I1201 14:28:57.953138 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a42d9c-35e4-437d-8f54-47a3cef27d7e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "31a42d9c-35e4-437d-8f54-47a3cef27d7e" (UID: "31a42d9c-35e4-437d-8f54-47a3cef27d7e"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:28:57 crc kubenswrapper[4585]: I1201 14:28:57.955464 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a42d9c-35e4-437d-8f54-47a3cef27d7e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "31a42d9c-35e4-437d-8f54-47a3cef27d7e" (UID: "31a42d9c-35e4-437d-8f54-47a3cef27d7e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.030682 4585 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/31a42d9c-35e4-437d-8f54-47a3cef27d7e-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.030714 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7drd\" (UniqueName: \"kubernetes.io/projected/31a42d9c-35e4-437d-8f54-47a3cef27d7e-kube-api-access-b7drd\") on node \"crc\" DevicePath \"\"" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.030725 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a42d9c-35e4-437d-8f54-47a3cef27d7e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.347223 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" event={"ID":"31a42d9c-35e4-437d-8f54-47a3cef27d7e","Type":"ContainerDied","Data":"5c894a018b9f3d501caaabd7a00651eb2545b6c654dc61048917a044841118c9"} Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.347262 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c894a018b9f3d501caaabd7a00651eb2545b6c654dc61048917a044841118c9" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.347271 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pbt6b" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.413001 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:28:58 crc kubenswrapper[4585]: E1201 14:28:58.413252 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.434442 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x"] Dec 01 14:28:58 crc kubenswrapper[4585]: E1201 14:28:58.434852 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a42d9c-35e4-437d-8f54-47a3cef27d7e" containerName="ssh-known-hosts-edpm-deployment" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.434872 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a42d9c-35e4-437d-8f54-47a3cef27d7e" containerName="ssh-known-hosts-edpm-deployment" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.435081 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a42d9c-35e4-437d-8f54-47a3cef27d7e" containerName="ssh-known-hosts-edpm-deployment" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.435653 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.439963 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.440475 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.443684 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.444437 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x"] Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.444882 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.538590 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d807047-8744-4a9e-9bf8-1f492a8034b5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-td45x\" (UID: \"3d807047-8744-4a9e-9bf8-1f492a8034b5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.538787 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frtnq\" (UniqueName: \"kubernetes.io/projected/3d807047-8744-4a9e-9bf8-1f492a8034b5-kube-api-access-frtnq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-td45x\" (UID: \"3d807047-8744-4a9e-9bf8-1f492a8034b5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.538836 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d807047-8744-4a9e-9bf8-1f492a8034b5-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-td45x\" (UID: \"3d807047-8744-4a9e-9bf8-1f492a8034b5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.640399 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frtnq\" (UniqueName: \"kubernetes.io/projected/3d807047-8744-4a9e-9bf8-1f492a8034b5-kube-api-access-frtnq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-td45x\" (UID: \"3d807047-8744-4a9e-9bf8-1f492a8034b5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.640467 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d807047-8744-4a9e-9bf8-1f492a8034b5-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-td45x\" (UID: \"3d807047-8744-4a9e-9bf8-1f492a8034b5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.640592 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d807047-8744-4a9e-9bf8-1f492a8034b5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-td45x\" (UID: \"3d807047-8744-4a9e-9bf8-1f492a8034b5\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.643922 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d807047-8744-4a9e-9bf8-1f492a8034b5-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-td45x\" (UID: \"3d807047-8744-4a9e-9bf8-1f492a8034b5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.644092 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d807047-8744-4a9e-9bf8-1f492a8034b5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-td45x\" (UID: \"3d807047-8744-4a9e-9bf8-1f492a8034b5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.655997 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frtnq\" (UniqueName: \"kubernetes.io/projected/3d807047-8744-4a9e-9bf8-1f492a8034b5-kube-api-access-frtnq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-td45x\" (UID: \"3d807047-8744-4a9e-9bf8-1f492a8034b5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:28:58 crc kubenswrapper[4585]: I1201 14:28:58.752657 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:28:59 crc kubenswrapper[4585]: I1201 14:28:59.278819 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x"] Dec 01 14:28:59 crc kubenswrapper[4585]: I1201 14:28:59.357666 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" event={"ID":"3d807047-8744-4a9e-9bf8-1f492a8034b5","Type":"ContainerStarted","Data":"133f4258faad94a451efdba73f608cfc8c2513fa95a66362cab46e653c87e17e"} Dec 01 14:29:00 crc kubenswrapper[4585]: I1201 14:29:00.365880 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" event={"ID":"3d807047-8744-4a9e-9bf8-1f492a8034b5","Type":"ContainerStarted","Data":"ac92192fb80a6a2bb0411cdb9ad17fa2fc16ec549b93b538ed4bbb52c17e13b8"} Dec 01 14:29:00 crc kubenswrapper[4585]: I1201 14:29:00.388319 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" podStartSLOduration=1.620201154 podStartE2EDuration="2.38830229s" podCreationTimestamp="2025-12-01 14:28:58 +0000 UTC" firstStartedPulling="2025-12-01 14:28:59.280128239 +0000 UTC m=+1853.264342094" lastFinishedPulling="2025-12-01 14:29:00.048229375 +0000 UTC m=+1854.032443230" observedRunningTime="2025-12-01 14:29:00.384100168 +0000 UTC m=+1854.368314023" watchObservedRunningTime="2025-12-01 14:29:00.38830229 +0000 UTC m=+1854.372516135" Dec 01 14:29:09 crc kubenswrapper[4585]: I1201 14:29:09.477188 4585 generic.go:334] "Generic (PLEG): container finished" podID="3d807047-8744-4a9e-9bf8-1f492a8034b5" containerID="ac92192fb80a6a2bb0411cdb9ad17fa2fc16ec549b93b538ed4bbb52c17e13b8" exitCode=0 Dec 01 14:29:09 crc kubenswrapper[4585]: I1201 14:29:09.477347 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" 
event={"ID":"3d807047-8744-4a9e-9bf8-1f492a8034b5","Type":"ContainerDied","Data":"ac92192fb80a6a2bb0411cdb9ad17fa2fc16ec549b93b538ed4bbb52c17e13b8"} Dec 01 14:29:10 crc kubenswrapper[4585]: I1201 14:29:10.412411 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:29:10 crc kubenswrapper[4585]: E1201 14:29:10.412861 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:29:10 crc kubenswrapper[4585]: I1201 14:29:10.902283 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:29:10 crc kubenswrapper[4585]: I1201 14:29:10.972168 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frtnq\" (UniqueName: \"kubernetes.io/projected/3d807047-8744-4a9e-9bf8-1f492a8034b5-kube-api-access-frtnq\") pod \"3d807047-8744-4a9e-9bf8-1f492a8034b5\" (UID: \"3d807047-8744-4a9e-9bf8-1f492a8034b5\") " Dec 01 14:29:10 crc kubenswrapper[4585]: I1201 14:29:10.972220 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d807047-8744-4a9e-9bf8-1f492a8034b5-ssh-key\") pod \"3d807047-8744-4a9e-9bf8-1f492a8034b5\" (UID: \"3d807047-8744-4a9e-9bf8-1f492a8034b5\") " Dec 01 14:29:10 crc kubenswrapper[4585]: I1201 14:29:10.974442 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d807047-8744-4a9e-9bf8-1f492a8034b5-inventory\") pod \"3d807047-8744-4a9e-9bf8-1f492a8034b5\" (UID: \"3d807047-8744-4a9e-9bf8-1f492a8034b5\") " Dec 01 14:29:10 crc kubenswrapper[4585]: I1201 14:29:10.989709 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d807047-8744-4a9e-9bf8-1f492a8034b5-kube-api-access-frtnq" (OuterVolumeSpecName: "kube-api-access-frtnq") pod "3d807047-8744-4a9e-9bf8-1f492a8034b5" (UID: "3d807047-8744-4a9e-9bf8-1f492a8034b5"). InnerVolumeSpecName "kube-api-access-frtnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:29:10 crc kubenswrapper[4585]: I1201 14:29:10.999536 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d807047-8744-4a9e-9bf8-1f492a8034b5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3d807047-8744-4a9e-9bf8-1f492a8034b5" (UID: "3d807047-8744-4a9e-9bf8-1f492a8034b5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.010011 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d807047-8744-4a9e-9bf8-1f492a8034b5-inventory" (OuterVolumeSpecName: "inventory") pod "3d807047-8744-4a9e-9bf8-1f492a8034b5" (UID: "3d807047-8744-4a9e-9bf8-1f492a8034b5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.076789 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frtnq\" (UniqueName: \"kubernetes.io/projected/3d807047-8744-4a9e-9bf8-1f492a8034b5-kube-api-access-frtnq\") on node \"crc\" DevicePath \"\"" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.076824 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d807047-8744-4a9e-9bf8-1f492a8034b5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.076833 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d807047-8744-4a9e-9bf8-1f492a8034b5-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.493872 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" event={"ID":"3d807047-8744-4a9e-9bf8-1f492a8034b5","Type":"ContainerDied","Data":"133f4258faad94a451efdba73f608cfc8c2513fa95a66362cab46e653c87e17e"} Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.493922 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="133f4258faad94a451efdba73f608cfc8c2513fa95a66362cab46e653c87e17e" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.493942 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-td45x" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.593440 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz"] Dec 01 14:29:11 crc kubenswrapper[4585]: E1201 14:29:11.593799 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d807047-8744-4a9e-9bf8-1f492a8034b5" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.593817 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d807047-8744-4a9e-9bf8-1f492a8034b5" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.594024 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d807047-8744-4a9e-9bf8-1f492a8034b5" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.594620 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.601762 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.601835 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.601862 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.602343 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.622601 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz"] Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.685822 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27a8adc5-7598-4bf7-b46f-9a853afce3e6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz\" (UID: \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.685881 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27a8adc5-7598-4bf7-b46f-9a853afce3e6-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz\" (UID: \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.686108 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m9s4\" (UniqueName: \"kubernetes.io/projected/27a8adc5-7598-4bf7-b46f-9a853afce3e6-kube-api-access-4m9s4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz\" (UID: \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.788735 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27a8adc5-7598-4bf7-b46f-9a853afce3e6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz\" (UID: \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.788799 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27a8adc5-7598-4bf7-b46f-9a853afce3e6-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz\" (UID: \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.788838 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m9s4\" (UniqueName: \"kubernetes.io/projected/27a8adc5-7598-4bf7-b46f-9a853afce3e6-kube-api-access-4m9s4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz\" (UID: 
\"27a8adc5-7598-4bf7-b46f-9a853afce3e6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.801363 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27a8adc5-7598-4bf7-b46f-9a853afce3e6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz\" (UID: \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.802505 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27a8adc5-7598-4bf7-b46f-9a853afce3e6-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz\" (UID: \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.810590 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m9s4\" (UniqueName: \"kubernetes.io/projected/27a8adc5-7598-4bf7-b46f-9a853afce3e6-kube-api-access-4m9s4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz\" (UID: \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:11 crc kubenswrapper[4585]: I1201 14:29:11.916665 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:12 crc kubenswrapper[4585]: I1201 14:29:12.480778 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz"] Dec 01 14:29:12 crc kubenswrapper[4585]: I1201 14:29:12.511043 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" event={"ID":"27a8adc5-7598-4bf7-b46f-9a853afce3e6","Type":"ContainerStarted","Data":"2d3cd761521ae580d7dcd75b02d7bdb61abc1ff3d76cb32ddbc484c2e4541b28"} Dec 01 14:29:13 crc kubenswrapper[4585]: I1201 14:29:13.520281 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" event={"ID":"27a8adc5-7598-4bf7-b46f-9a853afce3e6","Type":"ContainerStarted","Data":"4ab3726136abb27a154a404262ec482679181b8672f8defbf834e9d49a53d292"} Dec 01 14:29:13 crc kubenswrapper[4585]: I1201 14:29:13.552813 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" podStartSLOduration=2.002627624 podStartE2EDuration="2.552790284s" podCreationTimestamp="2025-12-01 14:29:11 +0000 UTC" firstStartedPulling="2025-12-01 14:29:12.492224634 +0000 UTC m=+1866.476438489" lastFinishedPulling="2025-12-01 14:29:13.042387294 +0000 UTC m=+1867.026601149" observedRunningTime="2025-12-01 14:29:13.55077537 +0000 UTC m=+1867.534989225" watchObservedRunningTime="2025-12-01 14:29:13.552790284 +0000 UTC m=+1867.537004139" Dec 01 14:29:23 crc kubenswrapper[4585]: I1201 14:29:23.412401 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:29:23 crc kubenswrapper[4585]: E1201 14:29:23.413170 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:29:23 crc kubenswrapper[4585]: I1201 14:29:23.599164 4585 generic.go:334] "Generic (PLEG): container finished" podID="27a8adc5-7598-4bf7-b46f-9a853afce3e6" containerID="4ab3726136abb27a154a404262ec482679181b8672f8defbf834e9d49a53d292" exitCode=0 Dec 01 14:29:23 crc kubenswrapper[4585]: I1201 14:29:23.599207 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" event={"ID":"27a8adc5-7598-4bf7-b46f-9a853afce3e6","Type":"ContainerDied","Data":"4ab3726136abb27a154a404262ec482679181b8672f8defbf834e9d49a53d292"} Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.075432 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.173250 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27a8adc5-7598-4bf7-b46f-9a853afce3e6-ssh-key\") pod \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\" (UID: \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\") " Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.173650 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27a8adc5-7598-4bf7-b46f-9a853afce3e6-inventory\") pod \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\" (UID: \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\") " Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.173875 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m9s4\" (UniqueName: \"kubernetes.io/projected/27a8adc5-7598-4bf7-b46f-9a853afce3e6-kube-api-access-4m9s4\") pod \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\" (UID: \"27a8adc5-7598-4bf7-b46f-9a853afce3e6\") " Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.180268 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a8adc5-7598-4bf7-b46f-9a853afce3e6-kube-api-access-4m9s4" (OuterVolumeSpecName: "kube-api-access-4m9s4") pod "27a8adc5-7598-4bf7-b46f-9a853afce3e6" (UID: "27a8adc5-7598-4bf7-b46f-9a853afce3e6"). InnerVolumeSpecName "kube-api-access-4m9s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.201938 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a8adc5-7598-4bf7-b46f-9a853afce3e6-inventory" (OuterVolumeSpecName: "inventory") pod "27a8adc5-7598-4bf7-b46f-9a853afce3e6" (UID: "27a8adc5-7598-4bf7-b46f-9a853afce3e6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.206602 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a8adc5-7598-4bf7-b46f-9a853afce3e6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "27a8adc5-7598-4bf7-b46f-9a853afce3e6" (UID: "27a8adc5-7598-4bf7-b46f-9a853afce3e6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.276074 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m9s4\" (UniqueName: \"kubernetes.io/projected/27a8adc5-7598-4bf7-b46f-9a853afce3e6-kube-api-access-4m9s4\") on node \"crc\" DevicePath \"\"" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.276236 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27a8adc5-7598-4bf7-b46f-9a853afce3e6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.276299 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27a8adc5-7598-4bf7-b46f-9a853afce3e6-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.619768 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" event={"ID":"27a8adc5-7598-4bf7-b46f-9a853afce3e6","Type":"ContainerDied","Data":"2d3cd761521ae580d7dcd75b02d7bdb61abc1ff3d76cb32ddbc484c2e4541b28"} Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.619805 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d3cd761521ae580d7dcd75b02d7bdb61abc1ff3d76cb32ddbc484c2e4541b28" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.619864 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.787736 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp"] Dec 01 14:29:25 crc kubenswrapper[4585]: E1201 14:29:25.788177 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a8adc5-7598-4bf7-b46f-9a853afce3e6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.788198 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a8adc5-7598-4bf7-b46f-9a853afce3e6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.788375 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a8adc5-7598-4bf7-b46f-9a853afce3e6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.789813 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.792056 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.792265 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.793505 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.793826 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.793991 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.794131 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.794306 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.794583 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.815234 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp"] Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.940831 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.940887 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.941020 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.941043 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.941063 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.941082 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqmr\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-kube-api-access-6mqmr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.941273 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.941422 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.941478 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.941525 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.941549 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.941576 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.941677 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:25 crc kubenswrapper[4585]: I1201 14:29:25.941724 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.042941 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043043 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043101 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043126 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043148 4585 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043169 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mqmr\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-kube-api-access-6mqmr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043210 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043248 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043269 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043291 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043307 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043329 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: 
\"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043366 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.043385 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.049177 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.049510 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.051647 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.051773 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.052624 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.053123 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.053346 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.055210 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.057710 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.058321 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.058623 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.059375 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.066732 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.071856 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mqmr\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-kube-api-access-6mqmr\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-29vjp\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.110987 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:29:26 crc kubenswrapper[4585]: I1201 14:29:26.676784 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp"] Dec 01 14:29:27 crc kubenswrapper[4585]: I1201 14:29:27.636243 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" event={"ID":"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f","Type":"ContainerStarted","Data":"7b0da903b49d64208c47614363a846e72c24d0e1f4aedfa169624982adeb3843"} Dec 01 14:29:27 crc kubenswrapper[4585]: I1201 14:29:27.636689 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" event={"ID":"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f","Type":"ContainerStarted","Data":"46bdda8e0311c1c9c2d40f809a879828c3059e41abdbe833ecf9387c562dca56"} Dec 01 14:29:27 crc kubenswrapper[4585]: I1201 14:29:27.657055 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" podStartSLOduration=2.027202739 podStartE2EDuration="2.657034906s" podCreationTimestamp="2025-12-01 14:29:25 +0000 UTC" firstStartedPulling="2025-12-01 14:29:26.687860974 +0000 UTC m=+1880.672074829" lastFinishedPulling="2025-12-01 14:29:27.317693141 +0000 UTC m=+1881.301906996" observedRunningTime="2025-12-01 14:29:27.651811697 +0000 UTC m=+1881.636025552" watchObservedRunningTime="2025-12-01 14:29:27.657034906 +0000 UTC m=+1881.641248781" Dec 01 14:29:35 crc kubenswrapper[4585]: I1201 14:29:35.412462 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:29:35 crc kubenswrapper[4585]: E1201 14:29:35.413239 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:29:46 crc kubenswrapper[4585]: I1201 14:29:46.417700 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:29:46 crc kubenswrapper[4585]: E1201 14:29:46.418373 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.151760 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm"] Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.153717 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.156048 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.156215 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.161492 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm"] Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.241207 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-config-volume\") pod \"collect-profiles-29409990-kkdvm\" (UID: \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.241260 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbmh2\" (UniqueName: \"kubernetes.io/projected/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-kube-api-access-lbmh2\") pod \"collect-profiles-29409990-kkdvm\" (UID: \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.241696 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-secret-volume\") pod \"collect-profiles-29409990-kkdvm\" (UID: \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.343650 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-secret-volume\") pod \"collect-profiles-29409990-kkdvm\" (UID: \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.343916 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-config-volume\") pod \"collect-profiles-29409990-kkdvm\" (UID: \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.344074 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbmh2\" (UniqueName: \"kubernetes.io/projected/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-kube-api-access-lbmh2\") 
pod \"collect-profiles-29409990-kkdvm\" (UID: \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.344782 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-config-volume\") pod \"collect-profiles-29409990-kkdvm\" (UID: \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.357699 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-secret-volume\") pod \"collect-profiles-29409990-kkdvm\" (UID: \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.363031 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbmh2\" (UniqueName: \"kubernetes.io/projected/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-kube-api-access-lbmh2\") pod \"collect-profiles-29409990-kkdvm\" (UID: \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.412616 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:30:00 crc kubenswrapper[4585]: E1201 14:30:00.412916 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.481737 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:00 crc kubenswrapper[4585]: I1201 14:30:00.943027 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm"] Dec 01 14:30:01 crc kubenswrapper[4585]: I1201 14:30:01.938367 4585 generic.go:334] "Generic (PLEG): container finished" podID="9b2b45f3-13cd-4528-ba33-bf2cecc7ee56" containerID="f2b0be5e15f62934438b93074ee18ee781010a8a27c42321005c2362c20f309d" exitCode=0 Dec 01 14:30:01 crc kubenswrapper[4585]: I1201 14:30:01.938629 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" event={"ID":"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56","Type":"ContainerDied","Data":"f2b0be5e15f62934438b93074ee18ee781010a8a27c42321005c2362c20f309d"} Dec 01 14:30:01 crc kubenswrapper[4585]: I1201 14:30:01.938653 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" event={"ID":"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56","Type":"ContainerStarted","Data":"3c717161770cb77674d0ce0ebdc8580d9f311cc9794986c6a064fbda2cf2d10c"} Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.322273 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.406420 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-secret-volume\") pod \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\" (UID: \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\") " Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.406497 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbmh2\" (UniqueName: \"kubernetes.io/projected/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-kube-api-access-lbmh2\") pod \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\" (UID: \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\") " Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.406622 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-config-volume\") pod \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\" (UID: \"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56\") " Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.407285 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-config-volume" (OuterVolumeSpecName: "config-volume") pod "9b2b45f3-13cd-4528-ba33-bf2cecc7ee56" (UID: "9b2b45f3-13cd-4528-ba33-bf2cecc7ee56"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.419203 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-kube-api-access-lbmh2" (OuterVolumeSpecName: "kube-api-access-lbmh2") pod "9b2b45f3-13cd-4528-ba33-bf2cecc7ee56" (UID: "9b2b45f3-13cd-4528-ba33-bf2cecc7ee56"). InnerVolumeSpecName "kube-api-access-lbmh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.427047 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9b2b45f3-13cd-4528-ba33-bf2cecc7ee56" (UID: "9b2b45f3-13cd-4528-ba33-bf2cecc7ee56"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.508923 4585 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.508956 4585 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.508966 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbmh2\" (UniqueName: \"kubernetes.io/projected/9b2b45f3-13cd-4528-ba33-bf2cecc7ee56-kube-api-access-lbmh2\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.959963 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" event={"ID":"9b2b45f3-13cd-4528-ba33-bf2cecc7ee56","Type":"ContainerDied","Data":"3c717161770cb77674d0ce0ebdc8580d9f311cc9794986c6a064fbda2cf2d10c"} Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.960020 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c717161770cb77674d0ce0ebdc8580d9f311cc9794986c6a064fbda2cf2d10c" Dec 01 14:30:03 crc kubenswrapper[4585]: I1201 14:30:03.960073 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409990-kkdvm" Dec 01 14:30:09 crc kubenswrapper[4585]: I1201 14:30:09.003359 4585 generic.go:334] "Generic (PLEG): container finished" podID="f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" containerID="7b0da903b49d64208c47614363a846e72c24d0e1f4aedfa169624982adeb3843" exitCode=0 Dec 01 14:30:09 crc kubenswrapper[4585]: I1201 14:30:09.004093 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" event={"ID":"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f","Type":"ContainerDied","Data":"7b0da903b49d64208c47614363a846e72c24d0e1f4aedfa169624982adeb3843"} Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.492124 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.543164 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-nova-combined-ca-bundle\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.543289 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.543337 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-inventory\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.544103 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.544190 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-telemetry-combined-ca-bundle\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.544233 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-ssh-key\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.544320 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-ovn-combined-ca-bundle\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.544367 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.544417 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-bootstrap-combined-ca-bundle\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc 
kubenswrapper[4585]: I1201 14:30:10.544440 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-neutron-metadata-combined-ca-bundle\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.544485 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mqmr\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-kube-api-access-6mqmr\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.544588 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.544639 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-libvirt-combined-ca-bundle\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.544669 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-repo-setup-combined-ca-bundle\") pod \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\" (UID: \"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f\") " Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.555196 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.555748 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-kube-api-access-6mqmr" (OuterVolumeSpecName: "kube-api-access-6mqmr") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "kube-api-access-6mqmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.560359 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.574282 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.574339 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.574342 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.574419 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.574577 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.574611 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.575844 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.577179 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.580500 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.595930 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-inventory" (OuterVolumeSpecName: "inventory") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.596452 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" (UID: "f27fcf2c-32e5-488b-b1f2-3f61b3096f4f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646480 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646516 4585 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646528 4585 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646541 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646550 4585 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646558 4585 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646568 4585 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646578 4585 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646590 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mqmr\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-kube-api-access-6mqmr\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646602 4585 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646615 4585 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646627 4585 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-repo-setup-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646636 4585 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:10 crc kubenswrapper[4585]: I1201 14:30:10.646646 4585 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f27fcf2c-32e5-488b-b1f2-3f61b3096f4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.026255 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" event={"ID":"f27fcf2c-32e5-488b-b1f2-3f61b3096f4f","Type":"ContainerDied","Data":"46bdda8e0311c1c9c2d40f809a879828c3059e41abdbe833ecf9387c562dca56"} Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.026292 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46bdda8e0311c1c9c2d40f809a879828c3059e41abdbe833ecf9387c562dca56" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.026322 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-29vjp" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.161398 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n"] Dec 01 14:30:11 crc kubenswrapper[4585]: E1201 14:30:11.161776 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.161794 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 14:30:11 crc kubenswrapper[4585]: E1201 14:30:11.161820 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b2b45f3-13cd-4528-ba33-bf2cecc7ee56" containerName="collect-profiles" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.161827 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b2b45f3-13cd-4528-ba33-bf2cecc7ee56" containerName="collect-profiles" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.162008 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27fcf2c-32e5-488b-b1f2-3f61b3096f4f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.162025 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b2b45f3-13cd-4528-ba33-bf2cecc7ee56" containerName="collect-profiles" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.162685 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.167068 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.167511 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.167801 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.167918 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.169070 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.173555 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n"] Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.262736 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxtgk\" (UniqueName: \"kubernetes.io/projected/3a6c0545-e1eb-412f-afee-4764733eff64-kube-api-access-gxtgk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.262816 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.262878 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.262911 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3a6c0545-e1eb-412f-afee-4764733eff64-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.263033 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.365043 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.365688 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.365796 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3a6c0545-e1eb-412f-afee-4764733eff64-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.365900 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.366056 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxtgk\" (UniqueName: \"kubernetes.io/projected/3a6c0545-e1eb-412f-afee-4764733eff64-kube-api-access-gxtgk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.366846 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3a6c0545-e1eb-412f-afee-4764733eff64-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.371212 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.379256 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.379942 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") 
" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.382399 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxtgk\" (UniqueName: \"kubernetes.io/projected/3a6c0545-e1eb-412f-afee-4764733eff64-kube-api-access-gxtgk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-n7w8n\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:11 crc kubenswrapper[4585]: I1201 14:30:11.495706 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:30:12 crc kubenswrapper[4585]: I1201 14:30:12.024869 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n"] Dec 01 14:30:13 crc kubenswrapper[4585]: I1201 14:30:13.044118 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" event={"ID":"3a6c0545-e1eb-412f-afee-4764733eff64","Type":"ContainerStarted","Data":"15d39454c165def551565f21c3913a28c3ac782bc9434925cf98344417422ec8"} Dec 01 14:30:13 crc kubenswrapper[4585]: I1201 14:30:13.044173 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" event={"ID":"3a6c0545-e1eb-412f-afee-4764733eff64","Type":"ContainerStarted","Data":"64c64fd9a182e6b7e5df38712523263ab216c5c66c959967eaffac38326a5ff4"} Dec 01 14:30:13 crc kubenswrapper[4585]: I1201 14:30:13.063372 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" podStartSLOduration=1.420544862 podStartE2EDuration="2.063342147s" podCreationTimestamp="2025-12-01 14:30:11 +0000 UTC" firstStartedPulling="2025-12-01 14:30:12.034409195 +0000 UTC m=+1926.018623050" lastFinishedPulling="2025-12-01 14:30:12.67720648 +0000 UTC m=+1926.661420335" observedRunningTime="2025-12-01 14:30:13.062757692 +0000 UTC m=+1927.046971547" watchObservedRunningTime="2025-12-01 14:30:13.063342147 +0000 UTC m=+1927.047556002" Dec 01 14:30:14 crc kubenswrapper[4585]: I1201 14:30:14.413278 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:30:14 crc kubenswrapper[4585]: E1201 14:30:14.413962 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:30:28 crc kubenswrapper[4585]: I1201 14:30:28.413030 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:30:28 crc kubenswrapper[4585]: E1201 14:30:28.413608 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:30:43 crc 
kubenswrapper[4585]: I1201 14:30:43.412716 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:30:43 crc kubenswrapper[4585]: E1201 14:30:43.414539 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:30:55 crc kubenswrapper[4585]: I1201 14:30:55.412612 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:30:56 crc kubenswrapper[4585]: I1201 14:30:56.465646 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"7cb7e396508a578a9169a8bc04f1f203090334b7c546cab0b808836f498dcef4"} Dec 01 14:31:19 crc kubenswrapper[4585]: I1201 14:31:19.655409 4585 generic.go:334] "Generic (PLEG): container finished" podID="3a6c0545-e1eb-412f-afee-4764733eff64" containerID="15d39454c165def551565f21c3913a28c3ac782bc9434925cf98344417422ec8" exitCode=0 Dec 01 14:31:19 crc kubenswrapper[4585]: I1201 14:31:19.655559 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" event={"ID":"3a6c0545-e1eb-412f-afee-4764733eff64","Type":"ContainerDied","Data":"15d39454c165def551565f21c3913a28c3ac782bc9434925cf98344417422ec8"} Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.076417 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.153058 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3a6c0545-e1eb-412f-afee-4764733eff64-ovncontroller-config-0\") pod \"3a6c0545-e1eb-412f-afee-4764733eff64\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.153199 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-ssh-key\") pod \"3a6c0545-e1eb-412f-afee-4764733eff64\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.153325 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxtgk\" (UniqueName: \"kubernetes.io/projected/3a6c0545-e1eb-412f-afee-4764733eff64-kube-api-access-gxtgk\") pod \"3a6c0545-e1eb-412f-afee-4764733eff64\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.153372 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-inventory\") pod \"3a6c0545-e1eb-412f-afee-4764733eff64\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.153455 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-ovn-combined-ca-bundle\") pod \"3a6c0545-e1eb-412f-afee-4764733eff64\" (UID: \"3a6c0545-e1eb-412f-afee-4764733eff64\") " Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.168622 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3a6c0545-e1eb-412f-afee-4764733eff64" (UID: "3a6c0545-e1eb-412f-afee-4764733eff64"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.168961 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a6c0545-e1eb-412f-afee-4764733eff64-kube-api-access-gxtgk" (OuterVolumeSpecName: "kube-api-access-gxtgk") pod "3a6c0545-e1eb-412f-afee-4764733eff64" (UID: "3a6c0545-e1eb-412f-afee-4764733eff64"). InnerVolumeSpecName "kube-api-access-gxtgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.184300 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-inventory" (OuterVolumeSpecName: "inventory") pod "3a6c0545-e1eb-412f-afee-4764733eff64" (UID: "3a6c0545-e1eb-412f-afee-4764733eff64"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.187134 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a6c0545-e1eb-412f-afee-4764733eff64-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3a6c0545-e1eb-412f-afee-4764733eff64" (UID: "3a6c0545-e1eb-412f-afee-4764733eff64"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.191175 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3a6c0545-e1eb-412f-afee-4764733eff64" (UID: "3a6c0545-e1eb-412f-afee-4764733eff64"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.255834 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.255868 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxtgk\" (UniqueName: \"kubernetes.io/projected/3a6c0545-e1eb-412f-afee-4764733eff64-kube-api-access-gxtgk\") on node \"crc\" DevicePath \"\"" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.255878 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.255904 4585 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a6c0545-e1eb-412f-afee-4764733eff64-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.255913 4585 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3a6c0545-e1eb-412f-afee-4764733eff64-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.679797 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" event={"ID":"3a6c0545-e1eb-412f-afee-4764733eff64","Type":"ContainerDied","Data":"64c64fd9a182e6b7e5df38712523263ab216c5c66c959967eaffac38326a5ff4"} Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.680097 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64c64fd9a182e6b7e5df38712523263ab216c5c66c959967eaffac38326a5ff4" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.680230 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-n7w8n" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.884242 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8"] Dec 01 14:31:21 crc kubenswrapper[4585]: E1201 14:31:21.884720 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a6c0545-e1eb-412f-afee-4764733eff64" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.884744 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a6c0545-e1eb-412f-afee-4764733eff64" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.885032 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a6c0545-e1eb-412f-afee-4764733eff64" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.885946 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.890031 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.891499 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.892349 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.894947 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.895017 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.895033 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.907341 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8"] Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.968012 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.968054 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.968117 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw69c\" (UniqueName: 
\"kubernetes.io/projected/1a9f93f2-8afc-428b-9578-dd3353f8a43b-kube-api-access-bw69c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.968155 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.968201 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:21 crc kubenswrapper[4585]: I1201 14:31:21.968235 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.070035 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.070121 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw69c\" (UniqueName: \"kubernetes.io/projected/1a9f93f2-8afc-428b-9578-dd3353f8a43b-kube-api-access-bw69c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.070164 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.070207 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.070239 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.070287 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.079856 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.079856 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.080485 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.083675 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.084836 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.091708 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bw69c\" (UniqueName: \"kubernetes.io/projected/1a9f93f2-8afc-428b-9578-dd3353f8a43b-kube-api-access-bw69c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.204374 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:31:22 crc kubenswrapper[4585]: I1201 14:31:22.744042 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8"] Dec 01 14:31:23 crc kubenswrapper[4585]: I1201 14:31:23.695321 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" event={"ID":"1a9f93f2-8afc-428b-9578-dd3353f8a43b","Type":"ContainerStarted","Data":"39b6eb376c8562588ba4cb91c77d2aae8e0f449e437c920b9f14c576e5b0578d"} Dec 01 14:31:23 crc kubenswrapper[4585]: I1201 14:31:23.695894 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" event={"ID":"1a9f93f2-8afc-428b-9578-dd3353f8a43b","Type":"ContainerStarted","Data":"76b040bf41040d086e8b6279fa7398b60d3df37d54dd2b5f15de1a6edbfed2ef"} Dec 01 14:31:23 crc kubenswrapper[4585]: I1201 14:31:23.716732 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" podStartSLOduration=2.054160138 podStartE2EDuration="2.71671027s" podCreationTimestamp="2025-12-01 14:31:21 +0000 UTC" firstStartedPulling="2025-12-01 14:31:22.74303883 +0000 UTC m=+1996.727252675" lastFinishedPulling="2025-12-01 14:31:23.405588952 +0000 UTC m=+1997.389802807" observedRunningTime="2025-12-01 14:31:23.710568177 +0000 UTC m=+1997.694782062" watchObservedRunningTime="2025-12-01 14:31:23.71671027 +0000 UTC m=+1997.700924135" Dec 01 14:32:15 crc kubenswrapper[4585]: I1201 14:32:15.175144 4585 generic.go:334] "Generic (PLEG): container finished" podID="1a9f93f2-8afc-428b-9578-dd3353f8a43b" containerID="39b6eb376c8562588ba4cb91c77d2aae8e0f449e437c920b9f14c576e5b0578d" exitCode=0 Dec 01 14:32:15 crc kubenswrapper[4585]: I1201 14:32:15.175213 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" event={"ID":"1a9f93f2-8afc-428b-9578-dd3353f8a43b","Type":"ContainerDied","Data":"39b6eb376c8562588ba4cb91c77d2aae8e0f449e437c920b9f14c576e5b0578d"} Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.564833 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.659494 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-ssh-key\") pod \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.659549 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-nova-metadata-neutron-config-0\") pod \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.659584 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw69c\" (UniqueName: \"kubernetes.io/projected/1a9f93f2-8afc-428b-9578-dd3353f8a43b-kube-api-access-bw69c\") pod \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.659614 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.659670 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-inventory\") pod \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.659787 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-neutron-metadata-combined-ca-bundle\") pod \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\" (UID: \"1a9f93f2-8afc-428b-9578-dd3353f8a43b\") " Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.668736 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a9f93f2-8afc-428b-9578-dd3353f8a43b-kube-api-access-bw69c" (OuterVolumeSpecName: "kube-api-access-bw69c") pod "1a9f93f2-8afc-428b-9578-dd3353f8a43b" (UID: "1a9f93f2-8afc-428b-9578-dd3353f8a43b"). InnerVolumeSpecName "kube-api-access-bw69c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.690011 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1a9f93f2-8afc-428b-9578-dd3353f8a43b" (UID: "1a9f93f2-8afc-428b-9578-dd3353f8a43b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.694228 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1a9f93f2-8afc-428b-9578-dd3353f8a43b" (UID: "1a9f93f2-8afc-428b-9578-dd3353f8a43b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.694645 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-inventory" (OuterVolumeSpecName: "inventory") pod "1a9f93f2-8afc-428b-9578-dd3353f8a43b" (UID: "1a9f93f2-8afc-428b-9578-dd3353f8a43b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.697516 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "1a9f93f2-8afc-428b-9578-dd3353f8a43b" (UID: "1a9f93f2-8afc-428b-9578-dd3353f8a43b"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.709567 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "1a9f93f2-8afc-428b-9578-dd3353f8a43b" (UID: "1a9f93f2-8afc-428b-9578-dd3353f8a43b"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.761437 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw69c\" (UniqueName: \"kubernetes.io/projected/1a9f93f2-8afc-428b-9578-dd3353f8a43b-kube-api-access-bw69c\") on node \"crc\" DevicePath \"\"" Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.761466 4585 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.761478 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.761487 4585 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.761497 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:32:16 crc kubenswrapper[4585]: I1201 14:32:16.761507 4585 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a9f93f2-8afc-428b-9578-dd3353f8a43b-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.193113 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" event={"ID":"1a9f93f2-8afc-428b-9578-dd3353f8a43b","Type":"ContainerDied","Data":"76b040bf41040d086e8b6279fa7398b60d3df37d54dd2b5f15de1a6edbfed2ef"} Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.193148 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.193161 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76b040bf41040d086e8b6279fa7398b60d3df37d54dd2b5f15de1a6edbfed2ef" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.347274 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw"] Dec 01 14:32:17 crc kubenswrapper[4585]: E1201 14:32:17.347698 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9f93f2-8afc-428b-9578-dd3353f8a43b" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.347716 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9f93f2-8afc-428b-9578-dd3353f8a43b" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.347919 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9f93f2-8afc-428b-9578-dd3353f8a43b" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.348570 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.351534 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.351748 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.354399 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.354737 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.354884 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.362534 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw"] Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.473260 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.473435 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.473473 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.473535 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.473553 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g48lj\" (UniqueName: \"kubernetes.io/projected/a592f160-6520-4d70-94bd-5064e63fa1a0-kube-api-access-g48lj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.575126 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.575410 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g48lj\" (UniqueName: \"kubernetes.io/projected/a592f160-6520-4d70-94bd-5064e63fa1a0-kube-api-access-g48lj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.575496 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.575570 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.575593 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.579010 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.579092 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.579594 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.581523 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.598913 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g48lj\" (UniqueName: \"kubernetes.io/projected/a592f160-6520-4d70-94bd-5064e63fa1a0-kube-api-access-g48lj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:17 crc kubenswrapper[4585]: I1201 14:32:17.668217 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:32:18 crc kubenswrapper[4585]: I1201 14:32:18.280317 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw"] Dec 01 14:32:19 crc kubenswrapper[4585]: I1201 14:32:19.214738 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" event={"ID":"a592f160-6520-4d70-94bd-5064e63fa1a0","Type":"ContainerStarted","Data":"e80f8061361772cc42007240cb577371fd2eedec441fb4d02b13227a2255d45a"} Dec 01 14:32:20 crc kubenswrapper[4585]: I1201 14:32:20.224741 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" event={"ID":"a592f160-6520-4d70-94bd-5064e63fa1a0","Type":"ContainerStarted","Data":"7a206e31e4699bff1824816dcbea0a69b3bf03290314939aa1b794405d5cae12"} Dec 01 14:32:20 crc kubenswrapper[4585]: I1201 14:32:20.249264 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" podStartSLOduration=2.432731219 podStartE2EDuration="3.249242838s" podCreationTimestamp="2025-12-01 14:32:17 +0000 UTC" firstStartedPulling="2025-12-01 14:32:18.298885091 +0000 UTC m=+2052.283098946" lastFinishedPulling="2025-12-01 14:32:19.11539671 +0000 UTC m=+2053.099610565" observedRunningTime="2025-12-01 14:32:20.240720532 +0000 UTC m=+2054.224934397" watchObservedRunningTime="2025-12-01 14:32:20.249242838 +0000 UTC m=+2054.233456693" Dec 01 14:33:13 crc kubenswrapper[4585]: I1201 14:33:13.716213 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:33:13 crc kubenswrapper[4585]: I1201 14:33:13.716619 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:33:43 crc kubenswrapper[4585]: I1201 14:33:43.715989 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:33:43 crc kubenswrapper[4585]: I1201 14:33:43.716580 4585 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.289063 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6wnf6"] Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.292196 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.306847 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6wnf6"] Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.318347 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp4pb\" (UniqueName: \"kubernetes.io/projected/b56d6404-1646-44e9-b8e1-f54202fcbebe-kube-api-access-bp4pb\") pod \"redhat-operators-6wnf6\" (UID: \"b56d6404-1646-44e9-b8e1-f54202fcbebe\") " pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.318420 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b56d6404-1646-44e9-b8e1-f54202fcbebe-utilities\") pod \"redhat-operators-6wnf6\" (UID: \"b56d6404-1646-44e9-b8e1-f54202fcbebe\") " pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.318837 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b56d6404-1646-44e9-b8e1-f54202fcbebe-catalog-content\") pod \"redhat-operators-6wnf6\" (UID: \"b56d6404-1646-44e9-b8e1-f54202fcbebe\") " pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.442847 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp4pb\" (UniqueName: \"kubernetes.io/projected/b56d6404-1646-44e9-b8e1-f54202fcbebe-kube-api-access-bp4pb\") pod \"redhat-operators-6wnf6\" (UID: \"b56d6404-1646-44e9-b8e1-f54202fcbebe\") " pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.442903 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b56d6404-1646-44e9-b8e1-f54202fcbebe-utilities\") pod \"redhat-operators-6wnf6\" (UID: \"b56d6404-1646-44e9-b8e1-f54202fcbebe\") " pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.443067 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b56d6404-1646-44e9-b8e1-f54202fcbebe-catalog-content\") pod \"redhat-operators-6wnf6\" (UID: \"b56d6404-1646-44e9-b8e1-f54202fcbebe\") " pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.443697 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b56d6404-1646-44e9-b8e1-f54202fcbebe-utilities\") pod \"redhat-operators-6wnf6\" (UID: \"b56d6404-1646-44e9-b8e1-f54202fcbebe\") " 
pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.444283 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b56d6404-1646-44e9-b8e1-f54202fcbebe-catalog-content\") pod \"redhat-operators-6wnf6\" (UID: \"b56d6404-1646-44e9-b8e1-f54202fcbebe\") " pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.464544 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp4pb\" (UniqueName: \"kubernetes.io/projected/b56d6404-1646-44e9-b8e1-f54202fcbebe-kube-api-access-bp4pb\") pod \"redhat-operators-6wnf6\" (UID: \"b56d6404-1646-44e9-b8e1-f54202fcbebe\") " pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:33:56 crc kubenswrapper[4585]: I1201 14:33:56.615635 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:33:57 crc kubenswrapper[4585]: W1201 14:33:57.075273 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb56d6404_1646_44e9_b8e1_f54202fcbebe.slice/crio-b742245520093400fab95aa7c63ed8b7a6555fdb78947e3cbac8c2edd4161556 WatchSource:0}: Error finding container b742245520093400fab95aa7c63ed8b7a6555fdb78947e3cbac8c2edd4161556: Status 404 returned error can't find the container with id b742245520093400fab95aa7c63ed8b7a6555fdb78947e3cbac8c2edd4161556 Dec 01 14:33:57 crc kubenswrapper[4585]: I1201 14:33:57.083288 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6wnf6"] Dec 01 14:33:57 crc kubenswrapper[4585]: I1201 14:33:57.103690 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wnf6" event={"ID":"b56d6404-1646-44e9-b8e1-f54202fcbebe","Type":"ContainerStarted","Data":"b742245520093400fab95aa7c63ed8b7a6555fdb78947e3cbac8c2edd4161556"} Dec 01 14:33:58 crc kubenswrapper[4585]: I1201 14:33:58.113992 4585 generic.go:334] "Generic (PLEG): container finished" podID="b56d6404-1646-44e9-b8e1-f54202fcbebe" containerID="816148b30ccd1e8e05bca373656bcdd13434dfdb3e239e7195f8eba196c77ac0" exitCode=0 Dec 01 14:33:58 crc kubenswrapper[4585]: I1201 14:33:58.114183 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wnf6" event={"ID":"b56d6404-1646-44e9-b8e1-f54202fcbebe","Type":"ContainerDied","Data":"816148b30ccd1e8e05bca373656bcdd13434dfdb3e239e7195f8eba196c77ac0"} Dec 01 14:33:58 crc kubenswrapper[4585]: I1201 14:33:58.116067 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 14:34:00 crc kubenswrapper[4585]: I1201 14:34:00.134009 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wnf6" event={"ID":"b56d6404-1646-44e9-b8e1-f54202fcbebe","Type":"ContainerStarted","Data":"6dd5a2ec7a62013eb34eac48efedd4a7cf6333ecea7c85de16dd63d09c2dee6b"} Dec 01 14:34:02 crc kubenswrapper[4585]: I1201 14:34:02.153057 4585 generic.go:334] "Generic (PLEG): container finished" podID="b56d6404-1646-44e9-b8e1-f54202fcbebe" containerID="6dd5a2ec7a62013eb34eac48efedd4a7cf6333ecea7c85de16dd63d09c2dee6b" exitCode=0 Dec 01 14:34:02 crc kubenswrapper[4585]: I1201 14:34:02.153094 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wnf6" 
event={"ID":"b56d6404-1646-44e9-b8e1-f54202fcbebe","Type":"ContainerDied","Data":"6dd5a2ec7a62013eb34eac48efedd4a7cf6333ecea7c85de16dd63d09c2dee6b"} Dec 01 14:34:03 crc kubenswrapper[4585]: I1201 14:34:03.167870 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wnf6" event={"ID":"b56d6404-1646-44e9-b8e1-f54202fcbebe","Type":"ContainerStarted","Data":"93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d"} Dec 01 14:34:03 crc kubenswrapper[4585]: I1201 14:34:03.202938 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6wnf6" podStartSLOduration=2.419591302 podStartE2EDuration="7.202919589s" podCreationTimestamp="2025-12-01 14:33:56 +0000 UTC" firstStartedPulling="2025-12-01 14:33:58.115854763 +0000 UTC m=+2152.100068618" lastFinishedPulling="2025-12-01 14:34:02.89918305 +0000 UTC m=+2156.883396905" observedRunningTime="2025-12-01 14:34:03.194932188 +0000 UTC m=+2157.179146053" watchObservedRunningTime="2025-12-01 14:34:03.202919589 +0000 UTC m=+2157.187133444" Dec 01 14:34:06 crc kubenswrapper[4585]: I1201 14:34:06.616181 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:34:06 crc kubenswrapper[4585]: I1201 14:34:06.616862 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:34:07 crc kubenswrapper[4585]: I1201 14:34:07.670996 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6wnf6" podUID="b56d6404-1646-44e9-b8e1-f54202fcbebe" containerName="registry-server" probeResult="failure" output=< Dec 01 14:34:07 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:34:07 crc kubenswrapper[4585]: > Dec 01 14:34:13 crc kubenswrapper[4585]: I1201 14:34:13.716292 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:34:13 crc kubenswrapper[4585]: I1201 14:34:13.716864 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:34:13 crc kubenswrapper[4585]: I1201 14:34:13.716909 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:34:13 crc kubenswrapper[4585]: I1201 14:34:13.717506 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7cb7e396508a578a9169a8bc04f1f203090334b7c546cab0b808836f498dcef4"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:34:13 crc kubenswrapper[4585]: I1201 14:34:13.717554 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" 
containerName="machine-config-daemon" containerID="cri-o://7cb7e396508a578a9169a8bc04f1f203090334b7c546cab0b808836f498dcef4" gracePeriod=600 Dec 01 14:34:14 crc kubenswrapper[4585]: I1201 14:34:14.263893 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="7cb7e396508a578a9169a8bc04f1f203090334b7c546cab0b808836f498dcef4" exitCode=0 Dec 01 14:34:14 crc kubenswrapper[4585]: I1201 14:34:14.264118 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"7cb7e396508a578a9169a8bc04f1f203090334b7c546cab0b808836f498dcef4"} Dec 01 14:34:14 crc kubenswrapper[4585]: I1201 14:34:14.264595 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f"} Dec 01 14:34:14 crc kubenswrapper[4585]: I1201 14:34:14.264664 4585 scope.go:117] "RemoveContainer" containerID="8681a4c58b14dec4c6ae76a371a009a12a89609fe6403c1612097d1fadf88593" Dec 01 14:34:16 crc kubenswrapper[4585]: I1201 14:34:16.663182 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:34:16 crc kubenswrapper[4585]: I1201 14:34:16.715139 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:34:16 crc kubenswrapper[4585]: I1201 14:34:16.913821 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6wnf6"] Dec 01 14:34:18 crc kubenswrapper[4585]: I1201 14:34:18.299741 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6wnf6" podUID="b56d6404-1646-44e9-b8e1-f54202fcbebe" containerName="registry-server" containerID="cri-o://93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d" gracePeriod=2 Dec 01 14:34:18 crc kubenswrapper[4585]: E1201 14:34:18.436213 4585 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb56d6404_1646_44e9_b8e1_f54202fcbebe.slice/crio-93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d.scope\": RecentStats: unable to find data in memory cache]" Dec 01 14:34:18 crc kubenswrapper[4585]: I1201 14:34:18.772044 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:34:18 crc kubenswrapper[4585]: I1201 14:34:18.927826 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b56d6404-1646-44e9-b8e1-f54202fcbebe-utilities\") pod \"b56d6404-1646-44e9-b8e1-f54202fcbebe\" (UID: \"b56d6404-1646-44e9-b8e1-f54202fcbebe\") " Dec 01 14:34:18 crc kubenswrapper[4585]: I1201 14:34:18.927900 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp4pb\" (UniqueName: \"kubernetes.io/projected/b56d6404-1646-44e9-b8e1-f54202fcbebe-kube-api-access-bp4pb\") pod \"b56d6404-1646-44e9-b8e1-f54202fcbebe\" (UID: \"b56d6404-1646-44e9-b8e1-f54202fcbebe\") " Dec 01 14:34:18 crc kubenswrapper[4585]: I1201 14:34:18.928058 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b56d6404-1646-44e9-b8e1-f54202fcbebe-catalog-content\") pod \"b56d6404-1646-44e9-b8e1-f54202fcbebe\" (UID: \"b56d6404-1646-44e9-b8e1-f54202fcbebe\") " Dec 01 14:34:18 crc kubenswrapper[4585]: I1201 14:34:18.928797 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b56d6404-1646-44e9-b8e1-f54202fcbebe-utilities" (OuterVolumeSpecName: "utilities") pod "b56d6404-1646-44e9-b8e1-f54202fcbebe" (UID: "b56d6404-1646-44e9-b8e1-f54202fcbebe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:34:18 crc kubenswrapper[4585]: I1201 14:34:18.933738 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b56d6404-1646-44e9-b8e1-f54202fcbebe-kube-api-access-bp4pb" (OuterVolumeSpecName: "kube-api-access-bp4pb") pod "b56d6404-1646-44e9-b8e1-f54202fcbebe" (UID: "b56d6404-1646-44e9-b8e1-f54202fcbebe"). InnerVolumeSpecName "kube-api-access-bp4pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.030751 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp4pb\" (UniqueName: \"kubernetes.io/projected/b56d6404-1646-44e9-b8e1-f54202fcbebe-kube-api-access-bp4pb\") on node \"crc\" DevicePath \"\"" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.030790 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b56d6404-1646-44e9-b8e1-f54202fcbebe-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.044233 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b56d6404-1646-44e9-b8e1-f54202fcbebe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b56d6404-1646-44e9-b8e1-f54202fcbebe" (UID: "b56d6404-1646-44e9-b8e1-f54202fcbebe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.134307 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b56d6404-1646-44e9-b8e1-f54202fcbebe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.311043 4585 generic.go:334] "Generic (PLEG): container finished" podID="b56d6404-1646-44e9-b8e1-f54202fcbebe" containerID="93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d" exitCode=0 Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.311089 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wnf6" event={"ID":"b56d6404-1646-44e9-b8e1-f54202fcbebe","Type":"ContainerDied","Data":"93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d"} Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.311154 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6wnf6" event={"ID":"b56d6404-1646-44e9-b8e1-f54202fcbebe","Type":"ContainerDied","Data":"b742245520093400fab95aa7c63ed8b7a6555fdb78947e3cbac8c2edd4161556"} Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.311157 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6wnf6" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.311175 4585 scope.go:117] "RemoveContainer" containerID="93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.339232 4585 scope.go:117] "RemoveContainer" containerID="6dd5a2ec7a62013eb34eac48efedd4a7cf6333ecea7c85de16dd63d09c2dee6b" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.343249 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6wnf6"] Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.355352 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6wnf6"] Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.372604 4585 scope.go:117] "RemoveContainer" containerID="816148b30ccd1e8e05bca373656bcdd13434dfdb3e239e7195f8eba196c77ac0" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.409840 4585 scope.go:117] "RemoveContainer" containerID="93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d" Dec 01 14:34:19 crc kubenswrapper[4585]: E1201 14:34:19.410246 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d\": container with ID starting with 93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d not found: ID does not exist" containerID="93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.410287 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d"} err="failed to get container status \"93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d\": rpc error: code = NotFound desc = could not find container \"93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d\": container with ID starting with 93bfe64a381f87a7e70b8a23064b4ba175166c9f94f59dbc69c8b32aac2cfd0d not found: ID does not exist" Dec 01 14:34:19 crc 
kubenswrapper[4585]: I1201 14:34:19.410315 4585 scope.go:117] "RemoveContainer" containerID="6dd5a2ec7a62013eb34eac48efedd4a7cf6333ecea7c85de16dd63d09c2dee6b" Dec 01 14:34:19 crc kubenswrapper[4585]: E1201 14:34:19.410565 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dd5a2ec7a62013eb34eac48efedd4a7cf6333ecea7c85de16dd63d09c2dee6b\": container with ID starting with 6dd5a2ec7a62013eb34eac48efedd4a7cf6333ecea7c85de16dd63d09c2dee6b not found: ID does not exist" containerID="6dd5a2ec7a62013eb34eac48efedd4a7cf6333ecea7c85de16dd63d09c2dee6b" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.410592 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dd5a2ec7a62013eb34eac48efedd4a7cf6333ecea7c85de16dd63d09c2dee6b"} err="failed to get container status \"6dd5a2ec7a62013eb34eac48efedd4a7cf6333ecea7c85de16dd63d09c2dee6b\": rpc error: code = NotFound desc = could not find container \"6dd5a2ec7a62013eb34eac48efedd4a7cf6333ecea7c85de16dd63d09c2dee6b\": container with ID starting with 6dd5a2ec7a62013eb34eac48efedd4a7cf6333ecea7c85de16dd63d09c2dee6b not found: ID does not exist" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.410608 4585 scope.go:117] "RemoveContainer" containerID="816148b30ccd1e8e05bca373656bcdd13434dfdb3e239e7195f8eba196c77ac0" Dec 01 14:34:19 crc kubenswrapper[4585]: E1201 14:34:19.410833 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"816148b30ccd1e8e05bca373656bcdd13434dfdb3e239e7195f8eba196c77ac0\": container with ID starting with 816148b30ccd1e8e05bca373656bcdd13434dfdb3e239e7195f8eba196c77ac0 not found: ID does not exist" containerID="816148b30ccd1e8e05bca373656bcdd13434dfdb3e239e7195f8eba196c77ac0" Dec 01 14:34:19 crc kubenswrapper[4585]: I1201 14:34:19.410855 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"816148b30ccd1e8e05bca373656bcdd13434dfdb3e239e7195f8eba196c77ac0"} err="failed to get container status \"816148b30ccd1e8e05bca373656bcdd13434dfdb3e239e7195f8eba196c77ac0\": rpc error: code = NotFound desc = could not find container \"816148b30ccd1e8e05bca373656bcdd13434dfdb3e239e7195f8eba196c77ac0\": container with ID starting with 816148b30ccd1e8e05bca373656bcdd13434dfdb3e239e7195f8eba196c77ac0 not found: ID does not exist" Dec 01 14:34:20 crc kubenswrapper[4585]: I1201 14:34:20.424506 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b56d6404-1646-44e9-b8e1-f54202fcbebe" path="/var/lib/kubelet/pods/b56d6404-1646-44e9-b8e1-f54202fcbebe/volumes" Dec 01 14:35:25 crc kubenswrapper[4585]: I1201 14:35:25.983456 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gksb5"] Dec 01 14:35:25 crc kubenswrapper[4585]: E1201 14:35:25.984246 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56d6404-1646-44e9-b8e1-f54202fcbebe" containerName="extract-utilities" Dec 01 14:35:25 crc kubenswrapper[4585]: I1201 14:35:25.984258 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56d6404-1646-44e9-b8e1-f54202fcbebe" containerName="extract-utilities" Dec 01 14:35:25 crc kubenswrapper[4585]: E1201 14:35:25.984269 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56d6404-1646-44e9-b8e1-f54202fcbebe" containerName="registry-server" Dec 01 14:35:25 crc kubenswrapper[4585]: I1201 14:35:25.984276 4585 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b56d6404-1646-44e9-b8e1-f54202fcbebe" containerName="registry-server" Dec 01 14:35:25 crc kubenswrapper[4585]: E1201 14:35:25.984296 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56d6404-1646-44e9-b8e1-f54202fcbebe" containerName="extract-content" Dec 01 14:35:25 crc kubenswrapper[4585]: I1201 14:35:25.984302 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56d6404-1646-44e9-b8e1-f54202fcbebe" containerName="extract-content" Dec 01 14:35:25 crc kubenswrapper[4585]: I1201 14:35:25.984477 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b56d6404-1646-44e9-b8e1-f54202fcbebe" containerName="registry-server" Dec 01 14:35:25 crc kubenswrapper[4585]: I1201 14:35:25.985816 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:26 crc kubenswrapper[4585]: I1201 14:35:26.014965 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gksb5"] Dec 01 14:35:26 crc kubenswrapper[4585]: I1201 14:35:26.046809 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-catalog-content\") pod \"community-operators-gksb5\" (UID: \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\") " pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:26 crc kubenswrapper[4585]: I1201 14:35:26.046912 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-utilities\") pod \"community-operators-gksb5\" (UID: \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\") " pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:26 crc kubenswrapper[4585]: I1201 14:35:26.046988 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g44x5\" (UniqueName: \"kubernetes.io/projected/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-kube-api-access-g44x5\") pod \"community-operators-gksb5\" (UID: \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\") " pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:26 crc kubenswrapper[4585]: I1201 14:35:26.147560 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g44x5\" (UniqueName: \"kubernetes.io/projected/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-kube-api-access-g44x5\") pod \"community-operators-gksb5\" (UID: \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\") " pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:26 crc kubenswrapper[4585]: I1201 14:35:26.147645 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-catalog-content\") pod \"community-operators-gksb5\" (UID: \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\") " pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:26 crc kubenswrapper[4585]: I1201 14:35:26.147718 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-utilities\") pod \"community-operators-gksb5\" (UID: \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\") " pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:26 crc 
kubenswrapper[4585]: I1201 14:35:26.148215 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-utilities\") pod \"community-operators-gksb5\" (UID: \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\") " pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:26 crc kubenswrapper[4585]: I1201 14:35:26.148373 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-catalog-content\") pod \"community-operators-gksb5\" (UID: \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\") " pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:26 crc kubenswrapper[4585]: I1201 14:35:26.166850 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g44x5\" (UniqueName: \"kubernetes.io/projected/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-kube-api-access-g44x5\") pod \"community-operators-gksb5\" (UID: \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\") " pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:26 crc kubenswrapper[4585]: I1201 14:35:26.314241 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:26 crc kubenswrapper[4585]: I1201 14:35:26.947254 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gksb5"] Dec 01 14:35:27 crc kubenswrapper[4585]: I1201 14:35:27.952437 4585 generic.go:334] "Generic (PLEG): container finished" podID="5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" containerID="2d76df25b08380b72e9867d118963356a9be1d9947e0b49a530901f1a356cbb2" exitCode=0 Dec 01 14:35:27 crc kubenswrapper[4585]: I1201 14:35:27.952482 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gksb5" event={"ID":"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed","Type":"ContainerDied","Data":"2d76df25b08380b72e9867d118963356a9be1d9947e0b49a530901f1a356cbb2"} Dec 01 14:35:27 crc kubenswrapper[4585]: I1201 14:35:27.952696 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gksb5" event={"ID":"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed","Type":"ContainerStarted","Data":"c2c4efdb8d2a0513e953e7bbbe50bd6cf905bff8c5f1814aecea423fa2f72bf8"} Dec 01 14:35:28 crc kubenswrapper[4585]: I1201 14:35:28.963870 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gksb5" event={"ID":"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed","Type":"ContainerStarted","Data":"02bc4247cbb2c9d76528228e801d11bb7b33b0ded9263eff7929350b675e85ea"} Dec 01 14:35:29 crc kubenswrapper[4585]: I1201 14:35:29.974392 4585 generic.go:334] "Generic (PLEG): container finished" podID="5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" containerID="02bc4247cbb2c9d76528228e801d11bb7b33b0ded9263eff7929350b675e85ea" exitCode=0 Dec 01 14:35:29 crc kubenswrapper[4585]: I1201 14:35:29.974489 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gksb5" event={"ID":"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed","Type":"ContainerDied","Data":"02bc4247cbb2c9d76528228e801d11bb7b33b0ded9263eff7929350b675e85ea"} Dec 01 14:35:30 crc kubenswrapper[4585]: I1201 14:35:30.988548 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gksb5" 
event={"ID":"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed","Type":"ContainerStarted","Data":"553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f"} Dec 01 14:35:31 crc kubenswrapper[4585]: I1201 14:35:31.010441 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gksb5" podStartSLOduration=3.579106448 podStartE2EDuration="6.010423984s" podCreationTimestamp="2025-12-01 14:35:25 +0000 UTC" firstStartedPulling="2025-12-01 14:35:27.954496777 +0000 UTC m=+2241.938710632" lastFinishedPulling="2025-12-01 14:35:30.385814293 +0000 UTC m=+2244.370028168" observedRunningTime="2025-12-01 14:35:31.003814189 +0000 UTC m=+2244.988028074" watchObservedRunningTime="2025-12-01 14:35:31.010423984 +0000 UTC m=+2244.994637839" Dec 01 14:35:36 crc kubenswrapper[4585]: I1201 14:35:36.314495 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:36 crc kubenswrapper[4585]: I1201 14:35:36.315081 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:36 crc kubenswrapper[4585]: I1201 14:35:36.375987 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:37 crc kubenswrapper[4585]: I1201 14:35:37.107838 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:37 crc kubenswrapper[4585]: I1201 14:35:37.158150 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gksb5"] Dec 01 14:35:39 crc kubenswrapper[4585]: I1201 14:35:39.062860 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gksb5" podUID="5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" containerName="registry-server" containerID="cri-o://553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f" gracePeriod=2 Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.074512 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.074581 4585 generic.go:334] "Generic (PLEG): container finished" podID="5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" containerID="553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f" exitCode=0 Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.074603 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gksb5" event={"ID":"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed","Type":"ContainerDied","Data":"553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f"} Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.075549 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gksb5" event={"ID":"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed","Type":"ContainerDied","Data":"c2c4efdb8d2a0513e953e7bbbe50bd6cf905bff8c5f1814aecea423fa2f72bf8"} Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.075586 4585 scope.go:117] "RemoveContainer" containerID="553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.099694 4585 scope.go:117] "RemoveContainer" containerID="02bc4247cbb2c9d76528228e801d11bb7b33b0ded9263eff7929350b675e85ea" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.136228 4585 scope.go:117] "RemoveContainer" containerID="2d76df25b08380b72e9867d118963356a9be1d9947e0b49a530901f1a356cbb2" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.173052 4585 scope.go:117] "RemoveContainer" containerID="553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f" Dec 01 14:35:40 crc kubenswrapper[4585]: E1201 14:35:40.173533 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f\": container with ID starting with 553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f not found: ID does not exist" containerID="553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.173663 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f"} err="failed to get container status \"553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f\": rpc error: code = NotFound desc = could not find container \"553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f\": container with ID starting with 553e4cc2bd13a1931b2ec66371bc11f1c62f6fbc1f85e9c55fed0604dc86797f not found: ID does not exist" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.173773 4585 scope.go:117] "RemoveContainer" containerID="02bc4247cbb2c9d76528228e801d11bb7b33b0ded9263eff7929350b675e85ea" Dec 01 14:35:40 crc kubenswrapper[4585]: E1201 14:35:40.174133 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02bc4247cbb2c9d76528228e801d11bb7b33b0ded9263eff7929350b675e85ea\": container with ID starting with 02bc4247cbb2c9d76528228e801d11bb7b33b0ded9263eff7929350b675e85ea not found: ID does not exist" containerID="02bc4247cbb2c9d76528228e801d11bb7b33b0ded9263eff7929350b675e85ea" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.174244 4585 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"02bc4247cbb2c9d76528228e801d11bb7b33b0ded9263eff7929350b675e85ea"} err="failed to get container status \"02bc4247cbb2c9d76528228e801d11bb7b33b0ded9263eff7929350b675e85ea\": rpc error: code = NotFound desc = could not find container \"02bc4247cbb2c9d76528228e801d11bb7b33b0ded9263eff7929350b675e85ea\": container with ID starting with 02bc4247cbb2c9d76528228e801d11bb7b33b0ded9263eff7929350b675e85ea not found: ID does not exist" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.174351 4585 scope.go:117] "RemoveContainer" containerID="2d76df25b08380b72e9867d118963356a9be1d9947e0b49a530901f1a356cbb2" Dec 01 14:35:40 crc kubenswrapper[4585]: E1201 14:35:40.174904 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d76df25b08380b72e9867d118963356a9be1d9947e0b49a530901f1a356cbb2\": container with ID starting with 2d76df25b08380b72e9867d118963356a9be1d9947e0b49a530901f1a356cbb2 not found: ID does not exist" containerID="2d76df25b08380b72e9867d118963356a9be1d9947e0b49a530901f1a356cbb2" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.174963 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d76df25b08380b72e9867d118963356a9be1d9947e0b49a530901f1a356cbb2"} err="failed to get container status \"2d76df25b08380b72e9867d118963356a9be1d9947e0b49a530901f1a356cbb2\": rpc error: code = NotFound desc = could not find container \"2d76df25b08380b72e9867d118963356a9be1d9947e0b49a530901f1a356cbb2\": container with ID starting with 2d76df25b08380b72e9867d118963356a9be1d9947e0b49a530901f1a356cbb2 not found: ID does not exist" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.210156 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-utilities\") pod \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\" (UID: \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\") " Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.210220 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g44x5\" (UniqueName: \"kubernetes.io/projected/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-kube-api-access-g44x5\") pod \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\" (UID: \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\") " Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.210325 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-catalog-content\") pod \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\" (UID: \"5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed\") " Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.211362 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-utilities" (OuterVolumeSpecName: "utilities") pod "5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" (UID: "5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.217498 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-kube-api-access-g44x5" (OuterVolumeSpecName: "kube-api-access-g44x5") pod "5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" (UID: "5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed"). InnerVolumeSpecName "kube-api-access-g44x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.271864 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" (UID: "5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.312004 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.312033 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:35:40 crc kubenswrapper[4585]: I1201 14:35:40.312043 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g44x5\" (UniqueName: \"kubernetes.io/projected/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed-kube-api-access-g44x5\") on node \"crc\" DevicePath \"\"" Dec 01 14:35:41 crc kubenswrapper[4585]: I1201 14:35:41.086876 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gksb5" Dec 01 14:35:41 crc kubenswrapper[4585]: I1201 14:35:41.117026 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gksb5"] Dec 01 14:35:41 crc kubenswrapper[4585]: I1201 14:35:41.129105 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gksb5"] Dec 01 14:35:42 crc kubenswrapper[4585]: I1201 14:35:42.425255 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" path="/var/lib/kubelet/pods/5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed/volumes" Dec 01 14:36:43 crc kubenswrapper[4585]: I1201 14:36:43.715953 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:36:43 crc kubenswrapper[4585]: I1201 14:36:43.716528 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:36:55 crc kubenswrapper[4585]: I1201 14:36:55.873398 4585 generic.go:334] "Generic (PLEG): container finished" podID="a592f160-6520-4d70-94bd-5064e63fa1a0" containerID="7a206e31e4699bff1824816dcbea0a69b3bf03290314939aa1b794405d5cae12" exitCode=0 Dec 01 14:36:55 crc kubenswrapper[4585]: I1201 14:36:55.874282 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" event={"ID":"a592f160-6520-4d70-94bd-5064e63fa1a0","Type":"ContainerDied","Data":"7a206e31e4699bff1824816dcbea0a69b3bf03290314939aa1b794405d5cae12"} Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.306441 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.505385 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-libvirt-combined-ca-bundle\") pod \"a592f160-6520-4d70-94bd-5064e63fa1a0\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.505422 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-ssh-key\") pod \"a592f160-6520-4d70-94bd-5064e63fa1a0\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.505491 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g48lj\" (UniqueName: \"kubernetes.io/projected/a592f160-6520-4d70-94bd-5064e63fa1a0-kube-api-access-g48lj\") pod \"a592f160-6520-4d70-94bd-5064e63fa1a0\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.505518 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-inventory\") pod \"a592f160-6520-4d70-94bd-5064e63fa1a0\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.505672 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-libvirt-secret-0\") pod \"a592f160-6520-4d70-94bd-5064e63fa1a0\" (UID: \"a592f160-6520-4d70-94bd-5064e63fa1a0\") " Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.513814 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a592f160-6520-4d70-94bd-5064e63fa1a0" (UID: "a592f160-6520-4d70-94bd-5064e63fa1a0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.514471 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a592f160-6520-4d70-94bd-5064e63fa1a0-kube-api-access-g48lj" (OuterVolumeSpecName: "kube-api-access-g48lj") pod "a592f160-6520-4d70-94bd-5064e63fa1a0" (UID: "a592f160-6520-4d70-94bd-5064e63fa1a0"). InnerVolumeSpecName "kube-api-access-g48lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.537533 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-inventory" (OuterVolumeSpecName: "inventory") pod "a592f160-6520-4d70-94bd-5064e63fa1a0" (UID: "a592f160-6520-4d70-94bd-5064e63fa1a0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.541227 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a592f160-6520-4d70-94bd-5064e63fa1a0" (UID: "a592f160-6520-4d70-94bd-5064e63fa1a0"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.545861 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a592f160-6520-4d70-94bd-5064e63fa1a0" (UID: "a592f160-6520-4d70-94bd-5064e63fa1a0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.612389 4585 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.612420 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.612430 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g48lj\" (UniqueName: \"kubernetes.io/projected/a592f160-6520-4d70-94bd-5064e63fa1a0-kube-api-access-g48lj\") on node \"crc\" DevicePath \"\"" Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.612438 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.612448 4585 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a592f160-6520-4d70-94bd-5064e63fa1a0-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.895653 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" event={"ID":"a592f160-6520-4d70-94bd-5064e63fa1a0","Type":"ContainerDied","Data":"e80f8061361772cc42007240cb577371fd2eedec441fb4d02b13227a2255d45a"} Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.896003 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e80f8061361772cc42007240cb577371fd2eedec441fb4d02b13227a2255d45a" Dec 01 14:36:57 crc kubenswrapper[4585]: I1201 14:36:57.895702 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.104185 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn"] Dec 01 14:36:58 crc kubenswrapper[4585]: E1201 14:36:58.104631 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" containerName="registry-server" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.104649 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" containerName="registry-server" Dec 01 14:36:58 crc kubenswrapper[4585]: E1201 14:36:58.104669 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a592f160-6520-4d70-94bd-5064e63fa1a0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.104677 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a592f160-6520-4d70-94bd-5064e63fa1a0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 14:36:58 crc kubenswrapper[4585]: E1201 14:36:58.104702 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" containerName="extract-content" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.104709 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" containerName="extract-content" Dec 01 14:36:58 crc kubenswrapper[4585]: E1201 14:36:58.104726 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" containerName="extract-utilities" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.104732 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" containerName="extract-utilities" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.104938 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fd6fd9b-5d0c-4c51-86ae-be67d49dc2ed" containerName="registry-server" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.104963 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a592f160-6520-4d70-94bd-5064e63fa1a0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.112033 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.116121 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.116190 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.119361 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.119641 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.119781 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.119896 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.121804 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.121845 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.121870 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.121886 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.121915 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.121931 4585 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbgk5\" (UniqueName: \"kubernetes.io/projected/a10b857d-29b7-46a5-9c12-775200f3ab74-kube-api-access-dbgk5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.122011 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.122067 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.122094 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.129725 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn"] Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.135610 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.225353 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.225432 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.225459 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.225484 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.225525 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.225546 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbgk5\" (UniqueName: \"kubernetes.io/projected/a10b857d-29b7-46a5-9c12-775200f3ab74-kube-api-access-dbgk5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.225612 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.225664 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.225694 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.232926 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.235429 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.237659 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-extra-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.242820 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.243599 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.258647 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.259114 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.259729 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.316001 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbgk5\" (UniqueName: \"kubernetes.io/projected/a10b857d-29b7-46a5-9c12-775200f3ab74-kube-api-access-dbgk5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-b48gn\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.429571 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.867492 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn"] Dec 01 14:36:58 crc kubenswrapper[4585]: I1201 14:36:58.910351 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" event={"ID":"a10b857d-29b7-46a5-9c12-775200f3ab74","Type":"ContainerStarted","Data":"d8415b1fa58ed198a5d25d32a5712a87fc5f7cb907c991a1725d797e67721943"} Dec 01 14:37:00 crc kubenswrapper[4585]: I1201 14:37:00.927401 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" event={"ID":"a10b857d-29b7-46a5-9c12-775200f3ab74","Type":"ContainerStarted","Data":"5ba3a7e8bcd89104998f5adcbe09def5b8f6f4f1fe1b1b66f60e5d1094659e8b"} Dec 01 14:37:00 crc kubenswrapper[4585]: I1201 14:37:00.953102 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" podStartSLOduration=2.157831282 podStartE2EDuration="2.953079234s" podCreationTimestamp="2025-12-01 14:36:58 +0000 UTC" firstStartedPulling="2025-12-01 14:36:58.873265141 +0000 UTC m=+2332.857478996" lastFinishedPulling="2025-12-01 14:36:59.668513093 +0000 UTC m=+2333.652726948" observedRunningTime="2025-12-01 14:37:00.946099579 +0000 UTC m=+2334.930313434" watchObservedRunningTime="2025-12-01 14:37:00.953079234 +0000 UTC m=+2334.937293089" Dec 01 14:37:13 crc kubenswrapper[4585]: I1201 14:37:13.716283 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:37:13 crc kubenswrapper[4585]: I1201 14:37:13.717100 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:37:43 crc kubenswrapper[4585]: I1201 14:37:43.716360 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:37:43 crc kubenswrapper[4585]: I1201 14:37:43.716961 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:37:43 crc kubenswrapper[4585]: I1201 14:37:43.717122 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:37:43 crc kubenswrapper[4585]: I1201 14:37:43.718016 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f"} 
pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:37:43 crc kubenswrapper[4585]: I1201 14:37:43.718076 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" gracePeriod=600 Dec 01 14:37:43 crc kubenswrapper[4585]: E1201 14:37:43.840954 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:37:44 crc kubenswrapper[4585]: I1201 14:37:44.363097 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" exitCode=0 Dec 01 14:37:44 crc kubenswrapper[4585]: I1201 14:37:44.363189 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f"} Dec 01 14:37:44 crc kubenswrapper[4585]: I1201 14:37:44.363423 4585 scope.go:117] "RemoveContainer" containerID="7cb7e396508a578a9169a8bc04f1f203090334b7c546cab0b808836f498dcef4" Dec 01 14:37:44 crc kubenswrapper[4585]: I1201 14:37:44.364040 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:37:44 crc kubenswrapper[4585]: E1201 14:37:44.364500 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:37:57 crc kubenswrapper[4585]: I1201 14:37:57.416042 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:37:57 crc kubenswrapper[4585]: E1201 14:37:57.417707 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:38:11 crc kubenswrapper[4585]: I1201 14:38:11.412647 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:38:11 crc kubenswrapper[4585]: E1201 14:38:11.413748 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:38:24 crc kubenswrapper[4585]: I1201 14:38:24.413661 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:38:24 crc kubenswrapper[4585]: E1201 14:38:24.415221 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:38:36 crc kubenswrapper[4585]: I1201 14:38:36.419155 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:38:36 crc kubenswrapper[4585]: E1201 14:38:36.420225 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:38:48 crc kubenswrapper[4585]: I1201 14:38:48.413254 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:38:48 crc kubenswrapper[4585]: E1201 14:38:48.414113 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:39:00 crc kubenswrapper[4585]: I1201 14:39:00.413110 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:39:00 crc kubenswrapper[4585]: E1201 14:39:00.413954 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:39:13 crc kubenswrapper[4585]: I1201 14:39:13.412563 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:39:13 crc kubenswrapper[4585]: E1201 14:39:13.413305 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:39:24 crc kubenswrapper[4585]: I1201 14:39:24.412710 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:39:24 crc kubenswrapper[4585]: E1201 14:39:24.413732 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:39:37 crc kubenswrapper[4585]: I1201 14:39:37.413268 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:39:37 crc kubenswrapper[4585]: E1201 14:39:37.415018 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:39:48 crc kubenswrapper[4585]: I1201 14:39:48.412369 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:39:48 crc kubenswrapper[4585]: E1201 14:39:48.413138 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:40:01 crc kubenswrapper[4585]: I1201 14:40:01.413186 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:40:01 crc kubenswrapper[4585]: E1201 14:40:01.414559 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:40:04 crc kubenswrapper[4585]: I1201 14:40:04.710822 4585 generic.go:334] "Generic (PLEG): container finished" podID="a10b857d-29b7-46a5-9c12-775200f3ab74" containerID="5ba3a7e8bcd89104998f5adcbe09def5b8f6f4f1fe1b1b66f60e5d1094659e8b" exitCode=0 Dec 01 14:40:04 crc kubenswrapper[4585]: I1201 14:40:04.711041 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" event={"ID":"a10b857d-29b7-46a5-9c12-775200f3ab74","Type":"ContainerDied","Data":"5ba3a7e8bcd89104998f5adcbe09def5b8f6f4f1fe1b1b66f60e5d1094659e8b"} Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.173920 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.353254 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-cell1-compute-config-1\") pod \"a10b857d-29b7-46a5-9c12-775200f3ab74\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.353534 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-extra-config-0\") pod \"a10b857d-29b7-46a5-9c12-775200f3ab74\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.353579 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-migration-ssh-key-0\") pod \"a10b857d-29b7-46a5-9c12-775200f3ab74\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.353628 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbgk5\" (UniqueName: \"kubernetes.io/projected/a10b857d-29b7-46a5-9c12-775200f3ab74-kube-api-access-dbgk5\") pod \"a10b857d-29b7-46a5-9c12-775200f3ab74\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.353664 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-inventory\") pod \"a10b857d-29b7-46a5-9c12-775200f3ab74\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.353959 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-migration-ssh-key-1\") pod \"a10b857d-29b7-46a5-9c12-775200f3ab74\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.354082 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-ssh-key\") pod \"a10b857d-29b7-46a5-9c12-775200f3ab74\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.354215 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-combined-ca-bundle\") pod \"a10b857d-29b7-46a5-9c12-775200f3ab74\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.354265 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-cell1-compute-config-0\") pod \"a10b857d-29b7-46a5-9c12-775200f3ab74\" (UID: \"a10b857d-29b7-46a5-9c12-775200f3ab74\") " Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.358284 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a10b857d-29b7-46a5-9c12-775200f3ab74-kube-api-access-dbgk5" (OuterVolumeSpecName: "kube-api-access-dbgk5") pod "a10b857d-29b7-46a5-9c12-775200f3ab74" (UID: "a10b857d-29b7-46a5-9c12-775200f3ab74"). InnerVolumeSpecName "kube-api-access-dbgk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.376386 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a10b857d-29b7-46a5-9c12-775200f3ab74" (UID: "a10b857d-29b7-46a5-9c12-775200f3ab74"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.395155 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-inventory" (OuterVolumeSpecName: "inventory") pod "a10b857d-29b7-46a5-9c12-775200f3ab74" (UID: "a10b857d-29b7-46a5-9c12-775200f3ab74"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.403234 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "a10b857d-29b7-46a5-9c12-775200f3ab74" (UID: "a10b857d-29b7-46a5-9c12-775200f3ab74"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.406079 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a10b857d-29b7-46a5-9c12-775200f3ab74" (UID: "a10b857d-29b7-46a5-9c12-775200f3ab74"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.408153 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "a10b857d-29b7-46a5-9c12-775200f3ab74" (UID: "a10b857d-29b7-46a5-9c12-775200f3ab74"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.425035 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "a10b857d-29b7-46a5-9c12-775200f3ab74" (UID: "a10b857d-29b7-46a5-9c12-775200f3ab74"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.428493 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "a10b857d-29b7-46a5-9c12-775200f3ab74" (UID: "a10b857d-29b7-46a5-9c12-775200f3ab74"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.442953 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "a10b857d-29b7-46a5-9c12-775200f3ab74" (UID: "a10b857d-29b7-46a5-9c12-775200f3ab74"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.463132 4585 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.463166 4585 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.463204 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbgk5\" (UniqueName: \"kubernetes.io/projected/a10b857d-29b7-46a5-9c12-775200f3ab74-kube-api-access-dbgk5\") on node \"crc\" DevicePath \"\"" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.463217 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.463226 4585 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.463234 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.463242 4585 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.463251 4585 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.463279 4585 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/a10b857d-29b7-46a5-9c12-775200f3ab74-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.731503 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" event={"ID":"a10b857d-29b7-46a5-9c12-775200f3ab74","Type":"ContainerDied","Data":"d8415b1fa58ed198a5d25d32a5712a87fc5f7cb907c991a1725d797e67721943"} Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.731788 4585 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d8415b1fa58ed198a5d25d32a5712a87fc5f7cb907c991a1725d797e67721943" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.731604 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-b48gn" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.835377 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb"] Dec 01 14:40:06 crc kubenswrapper[4585]: E1201 14:40:06.836066 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10b857d-29b7-46a5-9c12-775200f3ab74" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.836135 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10b857d-29b7-46a5-9c12-775200f3ab74" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.836400 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="a10b857d-29b7-46a5-9c12-775200f3ab74" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.837331 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.840023 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.840101 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-z6vpw" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.840297 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.840355 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.840588 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.858512 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb"] Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.975664 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.975959 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.976093 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.976249 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.976359 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.976440 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:06 crc kubenswrapper[4585]: I1201 14:40:06.976585 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktx9n\" (UniqueName: \"kubernetes.io/projected/1232f97e-9bf9-4917-b806-e5de8f180f70-kube-api-access-ktx9n\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.078328 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.078714 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.078805 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktx9n\" (UniqueName: \"kubernetes.io/projected/1232f97e-9bf9-4917-b806-e5de8f180f70-kube-api-access-ktx9n\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 
14:40:07.078843 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.078952 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.079023 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.079136 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.084076 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.084765 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.084823 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.085429 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc 
kubenswrapper[4585]: I1201 14:40:07.088959 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.094886 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.097374 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktx9n\" (UniqueName: \"kubernetes.io/projected/1232f97e-9bf9-4917-b806-e5de8f180f70-kube-api-access-ktx9n\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.156294 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.698990 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb"] Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.709775 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 14:40:07 crc kubenswrapper[4585]: I1201 14:40:07.742239 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" event={"ID":"1232f97e-9bf9-4917-b806-e5de8f180f70","Type":"ContainerStarted","Data":"b491654aeafaa1524ce7a008f5e73badeea6eb5810f207fc1aaae00925982ba1"} Dec 01 14:40:09 crc kubenswrapper[4585]: I1201 14:40:09.762435 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" event={"ID":"1232f97e-9bf9-4917-b806-e5de8f180f70","Type":"ContainerStarted","Data":"aa3b1c19477fc7179f73c98682f65838b40123fe4a44339bcfdea62afc9d748b"} Dec 01 14:40:09 crc kubenswrapper[4585]: I1201 14:40:09.792415 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" podStartSLOduration=2.601901252 podStartE2EDuration="3.792392211s" podCreationTimestamp="2025-12-01 14:40:06 +0000 UTC" firstStartedPulling="2025-12-01 14:40:07.709538781 +0000 UTC m=+2521.693752636" lastFinishedPulling="2025-12-01 14:40:08.90002974 +0000 UTC m=+2522.884243595" observedRunningTime="2025-12-01 14:40:09.784362238 +0000 UTC m=+2523.768576113" watchObservedRunningTime="2025-12-01 14:40:09.792392211 +0000 UTC m=+2523.776606066" Dec 01 14:40:13 crc kubenswrapper[4585]: I1201 14:40:13.412789 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:40:13 crc kubenswrapper[4585]: E1201 14:40:13.413457 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:40:27 crc kubenswrapper[4585]: I1201 14:40:27.412793 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:40:27 crc kubenswrapper[4585]: E1201 14:40:27.413649 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:40:38 crc kubenswrapper[4585]: I1201 14:40:38.412945 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:40:38 crc kubenswrapper[4585]: E1201 14:40:38.414543 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:40:49 crc kubenswrapper[4585]: I1201 14:40:49.412895 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:40:49 crc kubenswrapper[4585]: E1201 14:40:49.413893 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:41:02 crc kubenswrapper[4585]: I1201 14:41:02.413017 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:41:02 crc kubenswrapper[4585]: E1201 14:41:02.414257 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:41:14 crc kubenswrapper[4585]: I1201 14:41:14.413348 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:41:14 crc kubenswrapper[4585]: E1201 14:41:14.414113 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:41:26 crc kubenswrapper[4585]: I1201 14:41:26.420911 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:41:26 crc kubenswrapper[4585]: E1201 14:41:26.422156 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:41:41 crc kubenswrapper[4585]: I1201 14:41:41.412576 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:41:41 crc kubenswrapper[4585]: E1201 14:41:41.413459 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:41:54 crc kubenswrapper[4585]: I1201 14:41:54.413044 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:41:54 crc kubenswrapper[4585]: E1201 14:41:54.413925 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.708049 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8pws6"] Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.710379 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.737593 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pws6"] Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.894371 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gqqt\" (UniqueName: \"kubernetes.io/projected/0304d577-354f-483a-b982-4d785eb3410d-kube-api-access-9gqqt\") pod \"redhat-marketplace-8pws6\" (UID: \"0304d577-354f-483a-b982-4d785eb3410d\") " pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.895093 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0304d577-354f-483a-b982-4d785eb3410d-catalog-content\") pod \"redhat-marketplace-8pws6\" (UID: \"0304d577-354f-483a-b982-4d785eb3410d\") " pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.895497 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0304d577-354f-483a-b982-4d785eb3410d-utilities\") pod \"redhat-marketplace-8pws6\" (UID: \"0304d577-354f-483a-b982-4d785eb3410d\") " pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.903295 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4884k"] Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.906803 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.942469 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4884k"] Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.996823 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gqqt\" (UniqueName: \"kubernetes.io/projected/0304d577-354f-483a-b982-4d785eb3410d-kube-api-access-9gqqt\") pod \"redhat-marketplace-8pws6\" (UID: \"0304d577-354f-483a-b982-4d785eb3410d\") " pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.996879 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0304d577-354f-483a-b982-4d785eb3410d-catalog-content\") pod \"redhat-marketplace-8pws6\" (UID: \"0304d577-354f-483a-b982-4d785eb3410d\") " pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.997101 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0304d577-354f-483a-b982-4d785eb3410d-utilities\") pod \"redhat-marketplace-8pws6\" (UID: \"0304d577-354f-483a-b982-4d785eb3410d\") " pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.997579 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0304d577-354f-483a-b982-4d785eb3410d-utilities\") pod \"redhat-marketplace-8pws6\" (UID: \"0304d577-354f-483a-b982-4d785eb3410d\") " 
pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:06 crc kubenswrapper[4585]: I1201 14:42:06.997656 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0304d577-354f-483a-b982-4d785eb3410d-catalog-content\") pod \"redhat-marketplace-8pws6\" (UID: \"0304d577-354f-483a-b982-4d785eb3410d\") " pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.017582 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gqqt\" (UniqueName: \"kubernetes.io/projected/0304d577-354f-483a-b982-4d785eb3410d-kube-api-access-9gqqt\") pod \"redhat-marketplace-8pws6\" (UID: \"0304d577-354f-483a-b982-4d785eb3410d\") " pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.028706 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.098320 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-catalog-content\") pod \"certified-operators-4884k\" (UID: \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\") " pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.098376 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mbps\" (UniqueName: \"kubernetes.io/projected/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-kube-api-access-6mbps\") pod \"certified-operators-4884k\" (UID: \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\") " pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.098405 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-utilities\") pod \"certified-operators-4884k\" (UID: \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\") " pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.200863 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-catalog-content\") pod \"certified-operators-4884k\" (UID: \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\") " pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.201492 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mbps\" (UniqueName: \"kubernetes.io/projected/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-kube-api-access-6mbps\") pod \"certified-operators-4884k\" (UID: \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\") " pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.201511 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-utilities\") pod \"certified-operators-4884k\" (UID: \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\") " pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.201966 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-utilities\") pod \"certified-operators-4884k\" (UID: \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\") " pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.202569 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-catalog-content\") pod \"certified-operators-4884k\" (UID: \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\") " pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.239451 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mbps\" (UniqueName: \"kubernetes.io/projected/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-kube-api-access-6mbps\") pod \"certified-operators-4884k\" (UID: \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\") " pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.244535 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.670234 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pws6"] Dec 01 14:42:07 crc kubenswrapper[4585]: I1201 14:42:07.903262 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4884k"] Dec 01 14:42:08 crc kubenswrapper[4585]: I1201 14:42:08.589182 4585 generic.go:334] "Generic (PLEG): container finished" podID="0304d577-354f-483a-b982-4d785eb3410d" containerID="905134ea3176bf8f285c2e12d1e65a86e3f886d1b7131e1e622ac96ebd6d5f0c" exitCode=0 Dec 01 14:42:08 crc kubenswrapper[4585]: I1201 14:42:08.589261 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pws6" event={"ID":"0304d577-354f-483a-b982-4d785eb3410d","Type":"ContainerDied","Data":"905134ea3176bf8f285c2e12d1e65a86e3f886d1b7131e1e622ac96ebd6d5f0c"} Dec 01 14:42:08 crc kubenswrapper[4585]: I1201 14:42:08.589658 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pws6" event={"ID":"0304d577-354f-483a-b982-4d785eb3410d","Type":"ContainerStarted","Data":"d73d67c5cde21cda0428c0db199360d58612af2f1790c33a1145da917c91df7c"} Dec 01 14:42:08 crc kubenswrapper[4585]: I1201 14:42:08.591242 4585 generic.go:334] "Generic (PLEG): container finished" podID="bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" containerID="217728b9c2b51c473199257de9befb3946af7acedfd4e770794dc3976ee7f667" exitCode=0 Dec 01 14:42:08 crc kubenswrapper[4585]: I1201 14:42:08.591275 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4884k" event={"ID":"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d","Type":"ContainerDied","Data":"217728b9c2b51c473199257de9befb3946af7acedfd4e770794dc3976ee7f667"} Dec 01 14:42:08 crc kubenswrapper[4585]: I1201 14:42:08.591293 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4884k" event={"ID":"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d","Type":"ContainerStarted","Data":"332a729d9a1fd2bd760e6ddec218a1c0179b14950a8e7a11f0fdde7e40392ef8"} Dec 01 14:42:09 crc kubenswrapper[4585]: I1201 14:42:09.412592 4585 scope.go:117] "RemoveContainer" 
containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:42:09 crc kubenswrapper[4585]: E1201 14:42:09.413035 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:42:10 crc kubenswrapper[4585]: I1201 14:42:10.610984 4585 generic.go:334] "Generic (PLEG): container finished" podID="0304d577-354f-483a-b982-4d785eb3410d" containerID="f9636c7970aa918960e2b1d03337e960bed9011403d070092e5181ebc59eb278" exitCode=0 Dec 01 14:42:10 crc kubenswrapper[4585]: I1201 14:42:10.611023 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pws6" event={"ID":"0304d577-354f-483a-b982-4d785eb3410d","Type":"ContainerDied","Data":"f9636c7970aa918960e2b1d03337e960bed9011403d070092e5181ebc59eb278"} Dec 01 14:42:10 crc kubenswrapper[4585]: I1201 14:42:10.616949 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4884k" event={"ID":"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d","Type":"ContainerDied","Data":"45681f78708101fca599ffd3167dd087a5b9740a0e9e83b77ef2106e512d77f4"} Dec 01 14:42:10 crc kubenswrapper[4585]: I1201 14:42:10.617822 4585 generic.go:334] "Generic (PLEG): container finished" podID="bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" containerID="45681f78708101fca599ffd3167dd087a5b9740a0e9e83b77ef2106e512d77f4" exitCode=0 Dec 01 14:42:11 crc kubenswrapper[4585]: I1201 14:42:11.633235 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4884k" event={"ID":"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d","Type":"ContainerStarted","Data":"d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c"} Dec 01 14:42:11 crc kubenswrapper[4585]: I1201 14:42:11.667733 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4884k" podStartSLOduration=3.144974448 podStartE2EDuration="5.667714114s" podCreationTimestamp="2025-12-01 14:42:06 +0000 UTC" firstStartedPulling="2025-12-01 14:42:08.596712051 +0000 UTC m=+2642.580925896" lastFinishedPulling="2025-12-01 14:42:11.119451707 +0000 UTC m=+2645.103665562" observedRunningTime="2025-12-01 14:42:11.661095499 +0000 UTC m=+2645.645309354" watchObservedRunningTime="2025-12-01 14:42:11.667714114 +0000 UTC m=+2645.651927969" Dec 01 14:42:12 crc kubenswrapper[4585]: I1201 14:42:12.642696 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pws6" event={"ID":"0304d577-354f-483a-b982-4d785eb3410d","Type":"ContainerStarted","Data":"457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c"} Dec 01 14:42:12 crc kubenswrapper[4585]: I1201 14:42:12.668254 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8pws6" podStartSLOduration=3.419653988 podStartE2EDuration="6.668238674s" podCreationTimestamp="2025-12-01 14:42:06 +0000 UTC" firstStartedPulling="2025-12-01 14:42:08.591537744 +0000 UTC m=+2642.575751599" lastFinishedPulling="2025-12-01 14:42:11.84012243 +0000 UTC m=+2645.824336285" observedRunningTime="2025-12-01 14:42:12.662054841 +0000 UTC m=+2646.646268706" 
watchObservedRunningTime="2025-12-01 14:42:12.668238674 +0000 UTC m=+2646.652452529" Dec 01 14:42:17 crc kubenswrapper[4585]: I1201 14:42:17.029477 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:17 crc kubenswrapper[4585]: I1201 14:42:17.030092 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:17 crc kubenswrapper[4585]: I1201 14:42:17.075317 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:17 crc kubenswrapper[4585]: I1201 14:42:17.245092 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:17 crc kubenswrapper[4585]: I1201 14:42:17.245140 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:17 crc kubenswrapper[4585]: I1201 14:42:17.298617 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:17 crc kubenswrapper[4585]: I1201 14:42:17.743133 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:17 crc kubenswrapper[4585]: I1201 14:42:17.756253 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:19 crc kubenswrapper[4585]: I1201 14:42:19.089909 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pws6"] Dec 01 14:42:19 crc kubenswrapper[4585]: I1201 14:42:19.707162 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8pws6" podUID="0304d577-354f-483a-b982-4d785eb3410d" containerName="registry-server" containerID="cri-o://457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c" gracePeriod=2 Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.097848 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4884k"] Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.098342 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4884k" podUID="bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" containerName="registry-server" containerID="cri-o://d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c" gracePeriod=2 Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.265075 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.319447 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0304d577-354f-483a-b982-4d785eb3410d-catalog-content\") pod \"0304d577-354f-483a-b982-4d785eb3410d\" (UID: \"0304d577-354f-483a-b982-4d785eb3410d\") " Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.319636 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gqqt\" (UniqueName: \"kubernetes.io/projected/0304d577-354f-483a-b982-4d785eb3410d-kube-api-access-9gqqt\") pod \"0304d577-354f-483a-b982-4d785eb3410d\" (UID: \"0304d577-354f-483a-b982-4d785eb3410d\") " Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.319716 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0304d577-354f-483a-b982-4d785eb3410d-utilities\") pod \"0304d577-354f-483a-b982-4d785eb3410d\" (UID: \"0304d577-354f-483a-b982-4d785eb3410d\") " Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.320895 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0304d577-354f-483a-b982-4d785eb3410d-utilities" (OuterVolumeSpecName: "utilities") pod "0304d577-354f-483a-b982-4d785eb3410d" (UID: "0304d577-354f-483a-b982-4d785eb3410d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.347143 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0304d577-354f-483a-b982-4d785eb3410d-kube-api-access-9gqqt" (OuterVolumeSpecName: "kube-api-access-9gqqt") pod "0304d577-354f-483a-b982-4d785eb3410d" (UID: "0304d577-354f-483a-b982-4d785eb3410d"). InnerVolumeSpecName "kube-api-access-9gqqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.356443 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0304d577-354f-483a-b982-4d785eb3410d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0304d577-354f-483a-b982-4d785eb3410d" (UID: "0304d577-354f-483a-b982-4d785eb3410d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.443363 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0304d577-354f-483a-b982-4d785eb3410d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.443492 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gqqt\" (UniqueName: \"kubernetes.io/projected/0304d577-354f-483a-b982-4d785eb3410d-kube-api-access-9gqqt\") on node \"crc\" DevicePath \"\"" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.443511 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0304d577-354f-483a-b982-4d785eb3410d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.544874 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.647083 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-utilities\") pod \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\" (UID: \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\") " Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.647477 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-catalog-content\") pod \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\" (UID: \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\") " Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.647508 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mbps\" (UniqueName: \"kubernetes.io/projected/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-kube-api-access-6mbps\") pod \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\" (UID: \"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d\") " Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.648300 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-utilities" (OuterVolumeSpecName: "utilities") pod "bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" (UID: "bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.656711 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-kube-api-access-6mbps" (OuterVolumeSpecName: "kube-api-access-6mbps") pod "bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" (UID: "bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d"). InnerVolumeSpecName "kube-api-access-6mbps". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.700487 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" (UID: "bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.718340 4585 generic.go:334] "Generic (PLEG): container finished" podID="bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" containerID="d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c" exitCode=0 Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.718413 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4884k" event={"ID":"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d","Type":"ContainerDied","Data":"d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c"} Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.718434 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4884k" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.718453 4585 scope.go:117] "RemoveContainer" containerID="d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.718442 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4884k" event={"ID":"bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d","Type":"ContainerDied","Data":"332a729d9a1fd2bd760e6ddec218a1c0179b14950a8e7a11f0fdde7e40392ef8"} Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.721214 4585 generic.go:334] "Generic (PLEG): container finished" podID="0304d577-354f-483a-b982-4d785eb3410d" containerID="457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c" exitCode=0 Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.721285 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pws6" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.721300 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pws6" event={"ID":"0304d577-354f-483a-b982-4d785eb3410d","Type":"ContainerDied","Data":"457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c"} Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.722413 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pws6" event={"ID":"0304d577-354f-483a-b982-4d785eb3410d","Type":"ContainerDied","Data":"d73d67c5cde21cda0428c0db199360d58612af2f1790c33a1145da917c91df7c"} Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.748944 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pws6"] Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.750001 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.750023 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.750033 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mbps\" (UniqueName: \"kubernetes.io/projected/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d-kube-api-access-6mbps\") on node \"crc\" DevicePath \"\"" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.752184 4585 scope.go:117] "RemoveContainer" containerID="45681f78708101fca599ffd3167dd087a5b9740a0e9e83b77ef2106e512d77f4" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.757274 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pws6"] Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.771646 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4884k"] Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.779694 4585 scope.go:117] "RemoveContainer" containerID="217728b9c2b51c473199257de9befb3946af7acedfd4e770794dc3976ee7f667" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.780415 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4884k"] Dec 01 14:42:20 crc 
kubenswrapper[4585]: I1201 14:42:20.801480 4585 scope.go:117] "RemoveContainer" containerID="d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c" Dec 01 14:42:20 crc kubenswrapper[4585]: E1201 14:42:20.803818 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c\": container with ID starting with d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c not found: ID does not exist" containerID="d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.803864 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c"} err="failed to get container status \"d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c\": rpc error: code = NotFound desc = could not find container \"d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c\": container with ID starting with d3d47d50af21dcd7116784407b136953b9847e90e73cf132cc06926855c9041c not found: ID does not exist" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.803892 4585 scope.go:117] "RemoveContainer" containerID="45681f78708101fca599ffd3167dd087a5b9740a0e9e83b77ef2106e512d77f4" Dec 01 14:42:20 crc kubenswrapper[4585]: E1201 14:42:20.804507 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45681f78708101fca599ffd3167dd087a5b9740a0e9e83b77ef2106e512d77f4\": container with ID starting with 45681f78708101fca599ffd3167dd087a5b9740a0e9e83b77ef2106e512d77f4 not found: ID does not exist" containerID="45681f78708101fca599ffd3167dd087a5b9740a0e9e83b77ef2106e512d77f4" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.804611 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45681f78708101fca599ffd3167dd087a5b9740a0e9e83b77ef2106e512d77f4"} err="failed to get container status \"45681f78708101fca599ffd3167dd087a5b9740a0e9e83b77ef2106e512d77f4\": rpc error: code = NotFound desc = could not find container \"45681f78708101fca599ffd3167dd087a5b9740a0e9e83b77ef2106e512d77f4\": container with ID starting with 45681f78708101fca599ffd3167dd087a5b9740a0e9e83b77ef2106e512d77f4 not found: ID does not exist" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.804660 4585 scope.go:117] "RemoveContainer" containerID="217728b9c2b51c473199257de9befb3946af7acedfd4e770794dc3976ee7f667" Dec 01 14:42:20 crc kubenswrapper[4585]: E1201 14:42:20.805170 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"217728b9c2b51c473199257de9befb3946af7acedfd4e770794dc3976ee7f667\": container with ID starting with 217728b9c2b51c473199257de9befb3946af7acedfd4e770794dc3976ee7f667 not found: ID does not exist" containerID="217728b9c2b51c473199257de9befb3946af7acedfd4e770794dc3976ee7f667" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.805206 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217728b9c2b51c473199257de9befb3946af7acedfd4e770794dc3976ee7f667"} err="failed to get container status \"217728b9c2b51c473199257de9befb3946af7acedfd4e770794dc3976ee7f667\": rpc error: code = NotFound desc = could not find container 
\"217728b9c2b51c473199257de9befb3946af7acedfd4e770794dc3976ee7f667\": container with ID starting with 217728b9c2b51c473199257de9befb3946af7acedfd4e770794dc3976ee7f667 not found: ID does not exist" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.805228 4585 scope.go:117] "RemoveContainer" containerID="457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.851925 4585 scope.go:117] "RemoveContainer" containerID="f9636c7970aa918960e2b1d03337e960bed9011403d070092e5181ebc59eb278" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.878879 4585 scope.go:117] "RemoveContainer" containerID="905134ea3176bf8f285c2e12d1e65a86e3f886d1b7131e1e622ac96ebd6d5f0c" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.951188 4585 scope.go:117] "RemoveContainer" containerID="457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c" Dec 01 14:42:20 crc kubenswrapper[4585]: E1201 14:42:20.951646 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c\": container with ID starting with 457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c not found: ID does not exist" containerID="457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.951687 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c"} err="failed to get container status \"457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c\": rpc error: code = NotFound desc = could not find container \"457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c\": container with ID starting with 457ac69e66642af2f347f72c8ea9f4faf5df67d896167d99ee9de10f47db325c not found: ID does not exist" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.951713 4585 scope.go:117] "RemoveContainer" containerID="f9636c7970aa918960e2b1d03337e960bed9011403d070092e5181ebc59eb278" Dec 01 14:42:20 crc kubenswrapper[4585]: E1201 14:42:20.952088 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9636c7970aa918960e2b1d03337e960bed9011403d070092e5181ebc59eb278\": container with ID starting with f9636c7970aa918960e2b1d03337e960bed9011403d070092e5181ebc59eb278 not found: ID does not exist" containerID="f9636c7970aa918960e2b1d03337e960bed9011403d070092e5181ebc59eb278" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.952149 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9636c7970aa918960e2b1d03337e960bed9011403d070092e5181ebc59eb278"} err="failed to get container status \"f9636c7970aa918960e2b1d03337e960bed9011403d070092e5181ebc59eb278\": rpc error: code = NotFound desc = could not find container \"f9636c7970aa918960e2b1d03337e960bed9011403d070092e5181ebc59eb278\": container with ID starting with f9636c7970aa918960e2b1d03337e960bed9011403d070092e5181ebc59eb278 not found: ID does not exist" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.952178 4585 scope.go:117] "RemoveContainer" containerID="905134ea3176bf8f285c2e12d1e65a86e3f886d1b7131e1e622ac96ebd6d5f0c" Dec 01 14:42:20 crc kubenswrapper[4585]: E1201 14:42:20.952504 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"905134ea3176bf8f285c2e12d1e65a86e3f886d1b7131e1e622ac96ebd6d5f0c\": container with ID starting with 905134ea3176bf8f285c2e12d1e65a86e3f886d1b7131e1e622ac96ebd6d5f0c not found: ID does not exist" containerID="905134ea3176bf8f285c2e12d1e65a86e3f886d1b7131e1e622ac96ebd6d5f0c" Dec 01 14:42:20 crc kubenswrapper[4585]: I1201 14:42:20.952547 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905134ea3176bf8f285c2e12d1e65a86e3f886d1b7131e1e622ac96ebd6d5f0c"} err="failed to get container status \"905134ea3176bf8f285c2e12d1e65a86e3f886d1b7131e1e622ac96ebd6d5f0c\": rpc error: code = NotFound desc = could not find container \"905134ea3176bf8f285c2e12d1e65a86e3f886d1b7131e1e622ac96ebd6d5f0c\": container with ID starting with 905134ea3176bf8f285c2e12d1e65a86e3f886d1b7131e1e622ac96ebd6d5f0c not found: ID does not exist" Dec 01 14:42:21 crc kubenswrapper[4585]: I1201 14:42:21.412947 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:42:21 crc kubenswrapper[4585]: E1201 14:42:21.413210 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:42:22 crc kubenswrapper[4585]: I1201 14:42:22.426355 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0304d577-354f-483a-b982-4d785eb3410d" path="/var/lib/kubelet/pods/0304d577-354f-483a-b982-4d785eb3410d/volumes" Dec 01 14:42:22 crc kubenswrapper[4585]: I1201 14:42:22.427528 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" path="/var/lib/kubelet/pods/bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d/volumes" Dec 01 14:42:35 crc kubenswrapper[4585]: I1201 14:42:35.413042 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:42:35 crc kubenswrapper[4585]: E1201 14:42:35.414303 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:42:48 crc kubenswrapper[4585]: I1201 14:42:48.414134 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:42:49 crc kubenswrapper[4585]: I1201 14:42:49.000085 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"6ded03fb550f95ffdee9f445ca211a7031c49c1e9aa5ae8c0ec5434bb5ff5043"} Dec 01 14:43:03 crc kubenswrapper[4585]: I1201 14:43:03.131806 4585 generic.go:334] "Generic (PLEG): container finished" podID="1232f97e-9bf9-4917-b806-e5de8f180f70" containerID="aa3b1c19477fc7179f73c98682f65838b40123fe4a44339bcfdea62afc9d748b" exitCode=0 Dec 01 14:43:03 crc kubenswrapper[4585]: 
I1201 14:43:03.131904 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" event={"ID":"1232f97e-9bf9-4917-b806-e5de8f180f70","Type":"ContainerDied","Data":"aa3b1c19477fc7179f73c98682f65838b40123fe4a44339bcfdea62afc9d748b"} Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.578600 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.745772 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-0\") pod \"1232f97e-9bf9-4917-b806-e5de8f180f70\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.745943 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktx9n\" (UniqueName: \"kubernetes.io/projected/1232f97e-9bf9-4917-b806-e5de8f180f70-kube-api-access-ktx9n\") pod \"1232f97e-9bf9-4917-b806-e5de8f180f70\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.746010 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ssh-key\") pod \"1232f97e-9bf9-4917-b806-e5de8f180f70\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.746037 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-1\") pod \"1232f97e-9bf9-4917-b806-e5de8f180f70\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.746149 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-inventory\") pod \"1232f97e-9bf9-4917-b806-e5de8f180f70\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.746249 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-2\") pod \"1232f97e-9bf9-4917-b806-e5de8f180f70\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.746289 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-telemetry-combined-ca-bundle\") pod \"1232f97e-9bf9-4917-b806-e5de8f180f70\" (UID: \"1232f97e-9bf9-4917-b806-e5de8f180f70\") " Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.764522 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "1232f97e-9bf9-4917-b806-e5de8f180f70" (UID: "1232f97e-9bf9-4917-b806-e5de8f180f70"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.770824 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1232f97e-9bf9-4917-b806-e5de8f180f70-kube-api-access-ktx9n" (OuterVolumeSpecName: "kube-api-access-ktx9n") pod "1232f97e-9bf9-4917-b806-e5de8f180f70" (UID: "1232f97e-9bf9-4917-b806-e5de8f180f70"). InnerVolumeSpecName "kube-api-access-ktx9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.821553 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "1232f97e-9bf9-4917-b806-e5de8f180f70" (UID: "1232f97e-9bf9-4917-b806-e5de8f180f70"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.846808 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1232f97e-9bf9-4917-b806-e5de8f180f70" (UID: "1232f97e-9bf9-4917-b806-e5de8f180f70"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.849126 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktx9n\" (UniqueName: \"kubernetes.io/projected/1232f97e-9bf9-4917-b806-e5de8f180f70-kube-api-access-ktx9n\") on node \"crc\" DevicePath \"\"" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.849154 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.849164 4585 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.849174 4585 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.849272 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-inventory" (OuterVolumeSpecName: "inventory") pod "1232f97e-9bf9-4917-b806-e5de8f180f70" (UID: "1232f97e-9bf9-4917-b806-e5de8f180f70"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.858028 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "1232f97e-9bf9-4917-b806-e5de8f180f70" (UID: "1232f97e-9bf9-4917-b806-e5de8f180f70"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.884589 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "1232f97e-9bf9-4917-b806-e5de8f180f70" (UID: "1232f97e-9bf9-4917-b806-e5de8f180f70"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.951387 4585 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.951424 4585 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 14:43:04 crc kubenswrapper[4585]: I1201 14:43:04.951434 4585 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/1232f97e-9bf9-4917-b806-e5de8f180f70-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 01 14:43:05 crc kubenswrapper[4585]: I1201 14:43:05.151574 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" event={"ID":"1232f97e-9bf9-4917-b806-e5de8f180f70","Type":"ContainerDied","Data":"b491654aeafaa1524ce7a008f5e73badeea6eb5810f207fc1aaae00925982ba1"} Dec 01 14:43:05 crc kubenswrapper[4585]: I1201 14:43:05.151831 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b491654aeafaa1524ce7a008f5e73badeea6eb5810f207fc1aaae00925982ba1" Dec 01 14:43:05 crc kubenswrapper[4585]: I1201 14:43:05.151649 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb" Dec 01 14:43:42 crc kubenswrapper[4585]: E1201 14:43:42.992355 4585 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.44:55454->38.102.83.44:34393: read tcp 38.102.83.44:55454->38.102.83.44:34393: read: connection reset by peer Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.740952 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ddx8z"] Dec 01 14:43:59 crc kubenswrapper[4585]: E1201 14:43:59.742060 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1232f97e-9bf9-4917-b806-e5de8f180f70" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.742078 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1232f97e-9bf9-4917-b806-e5de8f180f70" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 01 14:43:59 crc kubenswrapper[4585]: E1201 14:43:59.742097 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0304d577-354f-483a-b982-4d785eb3410d" containerName="registry-server" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.742104 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0304d577-354f-483a-b982-4d785eb3410d" containerName="registry-server" Dec 01 14:43:59 crc kubenswrapper[4585]: E1201 14:43:59.742121 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" containerName="registry-server" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.742127 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" containerName="registry-server" Dec 01 14:43:59 crc kubenswrapper[4585]: E1201 14:43:59.742140 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0304d577-354f-483a-b982-4d785eb3410d" containerName="extract-content" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.742146 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0304d577-354f-483a-b982-4d785eb3410d" containerName="extract-content" Dec 01 14:43:59 crc kubenswrapper[4585]: E1201 14:43:59.742158 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" containerName="extract-utilities" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.742166 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" containerName="extract-utilities" Dec 01 14:43:59 crc kubenswrapper[4585]: E1201 14:43:59.742175 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0304d577-354f-483a-b982-4d785eb3410d" containerName="extract-utilities" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.742180 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0304d577-354f-483a-b982-4d785eb3410d" containerName="extract-utilities" Dec 01 14:43:59 crc kubenswrapper[4585]: E1201 14:43:59.742196 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" containerName="extract-content" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.742202 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" containerName="extract-content" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.742382 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0304d577-354f-483a-b982-4d785eb3410d" 
containerName="registry-server" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.742398 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe9e4d7-1b5c-441b-9d15-f65061f9fa9d" containerName="registry-server" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.742418 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1232f97e-9bf9-4917-b806-e5de8f180f70" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.746244 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.751173 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ddx8z"] Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.818188 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e63ce738-111b-4a99-bd44-a47793663a5e-catalog-content\") pod \"redhat-operators-ddx8z\" (UID: \"e63ce738-111b-4a99-bd44-a47793663a5e\") " pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.818246 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e63ce738-111b-4a99-bd44-a47793663a5e-utilities\") pod \"redhat-operators-ddx8z\" (UID: \"e63ce738-111b-4a99-bd44-a47793663a5e\") " pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.818355 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52snx\" (UniqueName: \"kubernetes.io/projected/e63ce738-111b-4a99-bd44-a47793663a5e-kube-api-access-52snx\") pod \"redhat-operators-ddx8z\" (UID: \"e63ce738-111b-4a99-bd44-a47793663a5e\") " pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.920583 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e63ce738-111b-4a99-bd44-a47793663a5e-catalog-content\") pod \"redhat-operators-ddx8z\" (UID: \"e63ce738-111b-4a99-bd44-a47793663a5e\") " pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.920639 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e63ce738-111b-4a99-bd44-a47793663a5e-utilities\") pod \"redhat-operators-ddx8z\" (UID: \"e63ce738-111b-4a99-bd44-a47793663a5e\") " pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.920753 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52snx\" (UniqueName: \"kubernetes.io/projected/e63ce738-111b-4a99-bd44-a47793663a5e-kube-api-access-52snx\") pod \"redhat-operators-ddx8z\" (UID: \"e63ce738-111b-4a99-bd44-a47793663a5e\") " pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.922041 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e63ce738-111b-4a99-bd44-a47793663a5e-catalog-content\") pod \"redhat-operators-ddx8z\" (UID: 
\"e63ce738-111b-4a99-bd44-a47793663a5e\") " pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.922314 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e63ce738-111b-4a99-bd44-a47793663a5e-utilities\") pod \"redhat-operators-ddx8z\" (UID: \"e63ce738-111b-4a99-bd44-a47793663a5e\") " pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:43:59 crc kubenswrapper[4585]: I1201 14:43:59.953762 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52snx\" (UniqueName: \"kubernetes.io/projected/e63ce738-111b-4a99-bd44-a47793663a5e-kube-api-access-52snx\") pod \"redhat-operators-ddx8z\" (UID: \"e63ce738-111b-4a99-bd44-a47793663a5e\") " pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:44:00 crc kubenswrapper[4585]: I1201 14:44:00.068693 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:44:00 crc kubenswrapper[4585]: I1201 14:44:00.600386 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ddx8z"] Dec 01 14:44:00 crc kubenswrapper[4585]: I1201 14:44:00.969360 4585 generic.go:334] "Generic (PLEG): container finished" podID="e63ce738-111b-4a99-bd44-a47793663a5e" containerID="7c4ee03dde4e6ff6a658768e8ff7ae7256b830c4f18532a004a2b77e7e766999" exitCode=0 Dec 01 14:44:00 crc kubenswrapper[4585]: I1201 14:44:00.969619 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddx8z" event={"ID":"e63ce738-111b-4a99-bd44-a47793663a5e","Type":"ContainerDied","Data":"7c4ee03dde4e6ff6a658768e8ff7ae7256b830c4f18532a004a2b77e7e766999"} Dec 01 14:44:00 crc kubenswrapper[4585]: I1201 14:44:00.969687 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddx8z" event={"ID":"e63ce738-111b-4a99-bd44-a47793663a5e","Type":"ContainerStarted","Data":"29cbc7bdd5558c7f0521ec1204bb204394735c32a72971529a134cd0ac517315"} Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.308823 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.310819 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.313192 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.313434 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-twgmj" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.318637 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.318937 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.326773 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.470880 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrjtj\" (UniqueName: \"kubernetes.io/projected/8c35f110-b7a3-4cbc-b181-1589a74f5d89-kube-api-access-qrjtj\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.470939 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.470966 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.471104 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.471171 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8c35f110-b7a3-4cbc-b181-1589a74f5d89-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.471395 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c35f110-b7a3-4cbc-b181-1589a74f5d89-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.471436 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/8c35f110-b7a3-4cbc-b181-1589a74f5d89-config-data\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.471513 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.471548 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8c35f110-b7a3-4cbc-b181-1589a74f5d89-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.573413 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8c35f110-b7a3-4cbc-b181-1589a74f5d89-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.573515 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c35f110-b7a3-4cbc-b181-1589a74f5d89-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.573550 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c35f110-b7a3-4cbc-b181-1589a74f5d89-config-data\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.573582 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.573611 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8c35f110-b7a3-4cbc-b181-1589a74f5d89-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.573650 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrjtj\" (UniqueName: \"kubernetes.io/projected/8c35f110-b7a3-4cbc-b181-1589a74f5d89-kube-api-access-qrjtj\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.574431 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/8c35f110-b7a3-4cbc-b181-1589a74f5d89-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.574543 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.574644 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.574791 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.574937 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.575049 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c35f110-b7a3-4cbc-b181-1589a74f5d89-config-data\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.575478 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8c35f110-b7a3-4cbc-b181-1589a74f5d89-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.575567 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8c35f110-b7a3-4cbc-b181-1589a74f5d89-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.580578 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.582545 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " 
pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.586427 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.596602 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrjtj\" (UniqueName: \"kubernetes.io/projected/8c35f110-b7a3-4cbc-b181-1589a74f5d89-kube-api-access-qrjtj\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.602440 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " pod="openstack/tempest-tests-tempest" Dec 01 14:44:01 crc kubenswrapper[4585]: I1201 14:44:01.681743 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 14:44:02 crc kubenswrapper[4585]: I1201 14:44:02.115662 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 14:44:02 crc kubenswrapper[4585]: I1201 14:44:02.988493 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddx8z" event={"ID":"e63ce738-111b-4a99-bd44-a47793663a5e","Type":"ContainerStarted","Data":"d2e99645b43db8660babf6c4691c9896aeb0ffc7dc66081dd9ea17823a8bb798"} Dec 01 14:44:02 crc kubenswrapper[4585]: I1201 14:44:02.993036 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8c35f110-b7a3-4cbc-b181-1589a74f5d89","Type":"ContainerStarted","Data":"ae55f3f6138b44f348dfe68c98045f56fde6028006a698c4d0370376785452e3"} Dec 01 14:44:05 crc kubenswrapper[4585]: I1201 14:44:05.025231 4585 generic.go:334] "Generic (PLEG): container finished" podID="e63ce738-111b-4a99-bd44-a47793663a5e" containerID="d2e99645b43db8660babf6c4691c9896aeb0ffc7dc66081dd9ea17823a8bb798" exitCode=0 Dec 01 14:44:05 crc kubenswrapper[4585]: I1201 14:44:05.025294 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddx8z" event={"ID":"e63ce738-111b-4a99-bd44-a47793663a5e","Type":"ContainerDied","Data":"d2e99645b43db8660babf6c4691c9896aeb0ffc7dc66081dd9ea17823a8bb798"} Dec 01 14:44:11 crc kubenswrapper[4585]: I1201 14:44:11.088176 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddx8z" event={"ID":"e63ce738-111b-4a99-bd44-a47793663a5e","Type":"ContainerStarted","Data":"92761291eef3e054eb9aa9fd6773b3e61c7c460ba643e312d7c87fa22f6e7a38"} Dec 01 14:44:12 crc kubenswrapper[4585]: I1201 14:44:12.128233 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ddx8z" podStartSLOduration=3.504042615 podStartE2EDuration="13.128211887s" podCreationTimestamp="2025-12-01 14:43:59 +0000 UTC" firstStartedPulling="2025-12-01 14:44:00.971049134 +0000 UTC m=+2754.955262989" lastFinishedPulling="2025-12-01 14:44:10.595218406 +0000 UTC m=+2764.579432261" observedRunningTime="2025-12-01 14:44:12.118445162 +0000 UTC m=+2766.102659017" watchObservedRunningTime="2025-12-01 
14:44:12.128211887 +0000 UTC m=+2766.112425742" Dec 01 14:44:20 crc kubenswrapper[4585]: I1201 14:44:20.069443 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:44:20 crc kubenswrapper[4585]: I1201 14:44:20.070311 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:44:21 crc kubenswrapper[4585]: I1201 14:44:21.286818 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ddx8z" podUID="e63ce738-111b-4a99-bd44-a47793663a5e" containerName="registry-server" probeResult="failure" output=< Dec 01 14:44:21 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:44:21 crc kubenswrapper[4585]: > Dec 01 14:44:30 crc kubenswrapper[4585]: I1201 14:44:30.115715 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:44:30 crc kubenswrapper[4585]: I1201 14:44:30.167469 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:44:30 crc kubenswrapper[4585]: I1201 14:44:30.936306 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ddx8z"] Dec 01 14:44:31 crc kubenswrapper[4585]: I1201 14:44:31.352015 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ddx8z" podUID="e63ce738-111b-4a99-bd44-a47793663a5e" containerName="registry-server" containerID="cri-o://92761291eef3e054eb9aa9fd6773b3e61c7c460ba643e312d7c87fa22f6e7a38" gracePeriod=2 Dec 01 14:44:32 crc kubenswrapper[4585]: I1201 14:44:32.381534 4585 generic.go:334] "Generic (PLEG): container finished" podID="e63ce738-111b-4a99-bd44-a47793663a5e" containerID="92761291eef3e054eb9aa9fd6773b3e61c7c460ba643e312d7c87fa22f6e7a38" exitCode=0 Dec 01 14:44:32 crc kubenswrapper[4585]: I1201 14:44:32.381679 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddx8z" event={"ID":"e63ce738-111b-4a99-bd44-a47793663a5e","Type":"ContainerDied","Data":"92761291eef3e054eb9aa9fd6773b3e61c7c460ba643e312d7c87fa22f6e7a38"} Dec 01 14:44:39 crc kubenswrapper[4585]: E1201 14:44:39.496116 4585 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 01 14:44:39 crc kubenswrapper[4585]: E1201 14:44:39.498368 4585 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrjtj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(8c35f110-b7a3-4cbc-b181-1589a74f5d89): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 14:44:39 crc kubenswrapper[4585]: E1201 14:44:39.499731 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="8c35f110-b7a3-4cbc-b181-1589a74f5d89" Dec 01 14:44:39 crc kubenswrapper[4585]: I1201 14:44:39.924802 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.050965 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52snx\" (UniqueName: \"kubernetes.io/projected/e63ce738-111b-4a99-bd44-a47793663a5e-kube-api-access-52snx\") pod \"e63ce738-111b-4a99-bd44-a47793663a5e\" (UID: \"e63ce738-111b-4a99-bd44-a47793663a5e\") " Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.051145 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e63ce738-111b-4a99-bd44-a47793663a5e-utilities\") pod \"e63ce738-111b-4a99-bd44-a47793663a5e\" (UID: \"e63ce738-111b-4a99-bd44-a47793663a5e\") " Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.051265 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e63ce738-111b-4a99-bd44-a47793663a5e-catalog-content\") pod \"e63ce738-111b-4a99-bd44-a47793663a5e\" (UID: \"e63ce738-111b-4a99-bd44-a47793663a5e\") " Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.051532 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e63ce738-111b-4a99-bd44-a47793663a5e-utilities" (OuterVolumeSpecName: "utilities") pod "e63ce738-111b-4a99-bd44-a47793663a5e" (UID: "e63ce738-111b-4a99-bd44-a47793663a5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.051697 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e63ce738-111b-4a99-bd44-a47793663a5e-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.057184 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e63ce738-111b-4a99-bd44-a47793663a5e-kube-api-access-52snx" (OuterVolumeSpecName: "kube-api-access-52snx") pod "e63ce738-111b-4a99-bd44-a47793663a5e" (UID: "e63ce738-111b-4a99-bd44-a47793663a5e"). InnerVolumeSpecName "kube-api-access-52snx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.126104 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e63ce738-111b-4a99-bd44-a47793663a5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e63ce738-111b-4a99-bd44-a47793663a5e" (UID: "e63ce738-111b-4a99-bd44-a47793663a5e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.153142 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e63ce738-111b-4a99-bd44-a47793663a5e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.153173 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52snx\" (UniqueName: \"kubernetes.io/projected/e63ce738-111b-4a99-bd44-a47793663a5e-kube-api-access-52snx\") on node \"crc\" DevicePath \"\"" Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.491346 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ddx8z" event={"ID":"e63ce738-111b-4a99-bd44-a47793663a5e","Type":"ContainerDied","Data":"29cbc7bdd5558c7f0521ec1204bb204394735c32a72971529a134cd0ac517315"} Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.491394 4585 scope.go:117] "RemoveContainer" containerID="92761291eef3e054eb9aa9fd6773b3e61c7c460ba643e312d7c87fa22f6e7a38" Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.491421 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ddx8z" Dec 01 14:44:40 crc kubenswrapper[4585]: E1201 14:44:40.494797 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="8c35f110-b7a3-4cbc-b181-1589a74f5d89" Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.544928 4585 scope.go:117] "RemoveContainer" containerID="d2e99645b43db8660babf6c4691c9896aeb0ffc7dc66081dd9ea17823a8bb798" Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.578400 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ddx8z"] Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.584662 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ddx8z"] Dec 01 14:44:40 crc kubenswrapper[4585]: I1201 14:44:40.585083 4585 scope.go:117] "RemoveContainer" containerID="7c4ee03dde4e6ff6a658768e8ff7ae7256b830c4f18532a004a2b77e7e766999" Dec 01 14:44:42 crc kubenswrapper[4585]: I1201 14:44:42.441673 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e63ce738-111b-4a99-bd44-a47793663a5e" path="/var/lib/kubelet/pods/e63ce738-111b-4a99-bd44-a47793663a5e/volumes" Dec 01 14:44:53 crc kubenswrapper[4585]: I1201 14:44:53.893546 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 14:44:55 crc kubenswrapper[4585]: I1201 14:44:55.631793 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8c35f110-b7a3-4cbc-b181-1589a74f5d89","Type":"ContainerStarted","Data":"8512da59d4272b463b14c3573c1e5d024a57983c7ceb15727e4da3753642f185"} Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.142146 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=8.380185481 podStartE2EDuration="1m0.142124417s" podCreationTimestamp="2025-12-01 14:44:00 +0000 UTC" firstStartedPulling="2025-12-01 14:44:02.120779797 +0000 UTC m=+2756.104993652" 
lastFinishedPulling="2025-12-01 14:44:53.882718733 +0000 UTC m=+2807.866932588" observedRunningTime="2025-12-01 14:44:55.652856434 +0000 UTC m=+2809.637070279" watchObservedRunningTime="2025-12-01 14:45:00.142124417 +0000 UTC m=+2814.126338272" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.148447 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6"] Dec 01 14:45:00 crc kubenswrapper[4585]: E1201 14:45:00.149131 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63ce738-111b-4a99-bd44-a47793663a5e" containerName="extract-content" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.149243 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63ce738-111b-4a99-bd44-a47793663a5e" containerName="extract-content" Dec 01 14:45:00 crc kubenswrapper[4585]: E1201 14:45:00.149359 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63ce738-111b-4a99-bd44-a47793663a5e" containerName="extract-utilities" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.149461 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63ce738-111b-4a99-bd44-a47793663a5e" containerName="extract-utilities" Dec 01 14:45:00 crc kubenswrapper[4585]: E1201 14:45:00.149583 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63ce738-111b-4a99-bd44-a47793663a5e" containerName="registry-server" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.149663 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63ce738-111b-4a99-bd44-a47793663a5e" containerName="registry-server" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.150153 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="e63ce738-111b-4a99-bd44-a47793663a5e" containerName="registry-server" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.151190 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.154707 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.154950 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.159888 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6"] Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.237891 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7900b55-d224-4908-9408-386643dc4e28-secret-volume\") pod \"collect-profiles-29410005-vbqr6\" (UID: \"c7900b55-d224-4908-9408-386643dc4e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.238272 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfvqd\" (UniqueName: \"kubernetes.io/projected/c7900b55-d224-4908-9408-386643dc4e28-kube-api-access-tfvqd\") pod \"collect-profiles-29410005-vbqr6\" (UID: \"c7900b55-d224-4908-9408-386643dc4e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.238306 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7900b55-d224-4908-9408-386643dc4e28-config-volume\") pod \"collect-profiles-29410005-vbqr6\" (UID: \"c7900b55-d224-4908-9408-386643dc4e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.340308 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7900b55-d224-4908-9408-386643dc4e28-secret-volume\") pod \"collect-profiles-29410005-vbqr6\" (UID: \"c7900b55-d224-4908-9408-386643dc4e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.340395 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfvqd\" (UniqueName: \"kubernetes.io/projected/c7900b55-d224-4908-9408-386643dc4e28-kube-api-access-tfvqd\") pod \"collect-profiles-29410005-vbqr6\" (UID: \"c7900b55-d224-4908-9408-386643dc4e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.340424 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7900b55-d224-4908-9408-386643dc4e28-config-volume\") pod \"collect-profiles-29410005-vbqr6\" (UID: \"c7900b55-d224-4908-9408-386643dc4e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.341637 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7900b55-d224-4908-9408-386643dc4e28-config-volume\") pod 
\"collect-profiles-29410005-vbqr6\" (UID: \"c7900b55-d224-4908-9408-386643dc4e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.346173 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7900b55-d224-4908-9408-386643dc4e28-secret-volume\") pod \"collect-profiles-29410005-vbqr6\" (UID: \"c7900b55-d224-4908-9408-386643dc4e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.359149 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfvqd\" (UniqueName: \"kubernetes.io/projected/c7900b55-d224-4908-9408-386643dc4e28-kube-api-access-tfvqd\") pod \"collect-profiles-29410005-vbqr6\" (UID: \"c7900b55-d224-4908-9408-386643dc4e28\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.520263 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:00 crc kubenswrapper[4585]: I1201 14:45:00.960559 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6"] Dec 01 14:45:01 crc kubenswrapper[4585]: I1201 14:45:01.683725 4585 generic.go:334] "Generic (PLEG): container finished" podID="c7900b55-d224-4908-9408-386643dc4e28" containerID="027ad5b6c85c40dbbf378b54dbc3abc90e24b8887cde515ae23548ebd7127203" exitCode=0 Dec 01 14:45:01 crc kubenswrapper[4585]: I1201 14:45:01.683827 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" event={"ID":"c7900b55-d224-4908-9408-386643dc4e28","Type":"ContainerDied","Data":"027ad5b6c85c40dbbf378b54dbc3abc90e24b8887cde515ae23548ebd7127203"} Dec 01 14:45:01 crc kubenswrapper[4585]: I1201 14:45:01.684037 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" event={"ID":"c7900b55-d224-4908-9408-386643dc4e28","Type":"ContainerStarted","Data":"e442f4c19cb4b85f38a39d68c0953002f1dbc1e6167a06fb121707880031b6cf"} Dec 01 14:45:02 crc kubenswrapper[4585]: I1201 14:45:02.981526 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:03 crc kubenswrapper[4585]: I1201 14:45:03.090645 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7900b55-d224-4908-9408-386643dc4e28-secret-volume\") pod \"c7900b55-d224-4908-9408-386643dc4e28\" (UID: \"c7900b55-d224-4908-9408-386643dc4e28\") " Dec 01 14:45:03 crc kubenswrapper[4585]: I1201 14:45:03.091009 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfvqd\" (UniqueName: \"kubernetes.io/projected/c7900b55-d224-4908-9408-386643dc4e28-kube-api-access-tfvqd\") pod \"c7900b55-d224-4908-9408-386643dc4e28\" (UID: \"c7900b55-d224-4908-9408-386643dc4e28\") " Dec 01 14:45:03 crc kubenswrapper[4585]: I1201 14:45:03.091075 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7900b55-d224-4908-9408-386643dc4e28-config-volume\") pod \"c7900b55-d224-4908-9408-386643dc4e28\" (UID: \"c7900b55-d224-4908-9408-386643dc4e28\") " Dec 01 14:45:03 crc kubenswrapper[4585]: I1201 14:45:03.091786 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7900b55-d224-4908-9408-386643dc4e28-config-volume" (OuterVolumeSpecName: "config-volume") pod "c7900b55-d224-4908-9408-386643dc4e28" (UID: "c7900b55-d224-4908-9408-386643dc4e28"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:45:03 crc kubenswrapper[4585]: I1201 14:45:03.097101 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7900b55-d224-4908-9408-386643dc4e28-kube-api-access-tfvqd" (OuterVolumeSpecName: "kube-api-access-tfvqd") pod "c7900b55-d224-4908-9408-386643dc4e28" (UID: "c7900b55-d224-4908-9408-386643dc4e28"). InnerVolumeSpecName "kube-api-access-tfvqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:45:03 crc kubenswrapper[4585]: I1201 14:45:03.097154 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7900b55-d224-4908-9408-386643dc4e28-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c7900b55-d224-4908-9408-386643dc4e28" (UID: "c7900b55-d224-4908-9408-386643dc4e28"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:45:03 crc kubenswrapper[4585]: I1201 14:45:03.192928 4585 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7900b55-d224-4908-9408-386643dc4e28-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 14:45:03 crc kubenswrapper[4585]: I1201 14:45:03.192956 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfvqd\" (UniqueName: \"kubernetes.io/projected/c7900b55-d224-4908-9408-386643dc4e28-kube-api-access-tfvqd\") on node \"crc\" DevicePath \"\"" Dec 01 14:45:03 crc kubenswrapper[4585]: I1201 14:45:03.192964 4585 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7900b55-d224-4908-9408-386643dc4e28-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 14:45:03 crc kubenswrapper[4585]: I1201 14:45:03.705643 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" event={"ID":"c7900b55-d224-4908-9408-386643dc4e28","Type":"ContainerDied","Data":"e442f4c19cb4b85f38a39d68c0953002f1dbc1e6167a06fb121707880031b6cf"} Dec 01 14:45:03 crc kubenswrapper[4585]: I1201 14:45:03.705707 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e442f4c19cb4b85f38a39d68c0953002f1dbc1e6167a06fb121707880031b6cf" Dec 01 14:45:03 crc kubenswrapper[4585]: I1201 14:45:03.705712 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410005-vbqr6" Dec 01 14:45:04 crc kubenswrapper[4585]: I1201 14:45:04.055988 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5"] Dec 01 14:45:04 crc kubenswrapper[4585]: I1201 14:45:04.062862 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409960-6f6w5"] Dec 01 14:45:04 crc kubenswrapper[4585]: I1201 14:45:04.424615 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca012d6-094a-4703-b8cc-d9d53fa9886d" path="/var/lib/kubelet/pods/1ca012d6-094a-4703-b8cc-d9d53fa9886d/volumes" Dec 01 14:45:13 crc kubenswrapper[4585]: I1201 14:45:13.716089 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:45:13 crc kubenswrapper[4585]: I1201 14:45:13.716577 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:45:35 crc kubenswrapper[4585]: I1201 14:45:35.141177 4585 scope.go:117] "RemoveContainer" containerID="589913f8e0eccdf800c0ca0f20d5850b40b34cbd7ee4a27991847f30a8b4690f" Dec 01 14:45:43 crc kubenswrapper[4585]: I1201 14:45:43.716082 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 01 14:45:43 crc kubenswrapper[4585]: I1201 14:45:43.716610 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:46:13 crc kubenswrapper[4585]: I1201 14:46:13.716651 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:46:13 crc kubenswrapper[4585]: I1201 14:46:13.717214 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:46:13 crc kubenswrapper[4585]: I1201 14:46:13.717261 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:46:13 crc kubenswrapper[4585]: I1201 14:46:13.718045 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ded03fb550f95ffdee9f445ca211a7031c49c1e9aa5ae8c0ec5434bb5ff5043"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:46:13 crc kubenswrapper[4585]: I1201 14:46:13.718117 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://6ded03fb550f95ffdee9f445ca211a7031c49c1e9aa5ae8c0ec5434bb5ff5043" gracePeriod=600 Dec 01 14:46:14 crc kubenswrapper[4585]: I1201 14:46:14.367843 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="6ded03fb550f95ffdee9f445ca211a7031c49c1e9aa5ae8c0ec5434bb5ff5043" exitCode=0 Dec 01 14:46:14 crc kubenswrapper[4585]: I1201 14:46:14.368274 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"6ded03fb550f95ffdee9f445ca211a7031c49c1e9aa5ae8c0ec5434bb5ff5043"} Dec 01 14:46:14 crc kubenswrapper[4585]: I1201 14:46:14.368330 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6"} Dec 01 14:46:14 crc kubenswrapper[4585]: I1201 14:46:14.368354 4585 scope.go:117] "RemoveContainer" containerID="a1dbd00cb2d9328b688322869d95c509a2164260f953ca463a70d2042af9535f" Dec 01 14:46:29 crc kubenswrapper[4585]: I1201 14:46:29.871689 4585 generic.go:334] "Generic (PLEG): container finished" podID="8c35f110-b7a3-4cbc-b181-1589a74f5d89" containerID="8512da59d4272b463b14c3573c1e5d024a57983c7ceb15727e4da3753642f185" exitCode=0 
Dec 01 14:46:29 crc kubenswrapper[4585]: I1201 14:46:29.871816 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8c35f110-b7a3-4cbc-b181-1589a74f5d89","Type":"ContainerDied","Data":"8512da59d4272b463b14c3573c1e5d024a57983c7ceb15727e4da3753642f185"} Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.299734 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.475603 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-openstack-config-secret\") pod \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.475757 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c35f110-b7a3-4cbc-b181-1589a74f5d89-openstack-config\") pod \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.475789 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c35f110-b7a3-4cbc-b181-1589a74f5d89-config-data\") pod \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.475838 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8c35f110-b7a3-4cbc-b181-1589a74f5d89-test-operator-ephemeral-workdir\") pod \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.475884 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8c35f110-b7a3-4cbc-b181-1589a74f5d89-test-operator-ephemeral-temporary\") pod \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.475928 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrjtj\" (UniqueName: \"kubernetes.io/projected/8c35f110-b7a3-4cbc-b181-1589a74f5d89-kube-api-access-qrjtj\") pod \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.475962 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-ssh-key\") pod \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.475999 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-ca-certs\") pod \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.476024 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\" (UID: \"8c35f110-b7a3-4cbc-b181-1589a74f5d89\") " Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.478748 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c35f110-b7a3-4cbc-b181-1589a74f5d89-config-data" (OuterVolumeSpecName: "config-data") pod "8c35f110-b7a3-4cbc-b181-1589a74f5d89" (UID: "8c35f110-b7a3-4cbc-b181-1589a74f5d89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.479625 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c35f110-b7a3-4cbc-b181-1589a74f5d89-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "8c35f110-b7a3-4cbc-b181-1589a74f5d89" (UID: "8c35f110-b7a3-4cbc-b181-1589a74f5d89"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.480686 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c35f110-b7a3-4cbc-b181-1589a74f5d89-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "8c35f110-b7a3-4cbc-b181-1589a74f5d89" (UID: "8c35f110-b7a3-4cbc-b181-1589a74f5d89"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.487127 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c35f110-b7a3-4cbc-b181-1589a74f5d89-kube-api-access-qrjtj" (OuterVolumeSpecName: "kube-api-access-qrjtj") pod "8c35f110-b7a3-4cbc-b181-1589a74f5d89" (UID: "8c35f110-b7a3-4cbc-b181-1589a74f5d89"). InnerVolumeSpecName "kube-api-access-qrjtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.501261 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "8c35f110-b7a3-4cbc-b181-1589a74f5d89" (UID: "8c35f110-b7a3-4cbc-b181-1589a74f5d89"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.525094 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8c35f110-b7a3-4cbc-b181-1589a74f5d89" (UID: "8c35f110-b7a3-4cbc-b181-1589a74f5d89"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.526125 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "8c35f110-b7a3-4cbc-b181-1589a74f5d89" (UID: "8c35f110-b7a3-4cbc-b181-1589a74f5d89"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.530278 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c35f110-b7a3-4cbc-b181-1589a74f5d89-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8c35f110-b7a3-4cbc-b181-1589a74f5d89" (UID: "8c35f110-b7a3-4cbc-b181-1589a74f5d89"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.539147 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8c35f110-b7a3-4cbc-b181-1589a74f5d89" (UID: "8c35f110-b7a3-4cbc-b181-1589a74f5d89"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.577717 4585 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8c35f110-b7a3-4cbc-b181-1589a74f5d89-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.578020 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c35f110-b7a3-4cbc-b181-1589a74f5d89-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.578111 4585 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8c35f110-b7a3-4cbc-b181-1589a74f5d89-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.578173 4585 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8c35f110-b7a3-4cbc-b181-1589a74f5d89-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.578266 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrjtj\" (UniqueName: \"kubernetes.io/projected/8c35f110-b7a3-4cbc-b181-1589a74f5d89-kube-api-access-qrjtj\") on node \"crc\" DevicePath \"\"" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.578350 4585 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.578425 4585 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.578743 4585 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.579543 4585 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8c35f110-b7a3-4cbc-b181-1589a74f5d89-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.599103 4585 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.638868 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wp6p8"] Dec 01 14:46:31 crc kubenswrapper[4585]: E1201 14:46:31.639508 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7900b55-d224-4908-9408-386643dc4e28" containerName="collect-profiles" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.639585 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7900b55-d224-4908-9408-386643dc4e28" containerName="collect-profiles" Dec 01 14:46:31 crc kubenswrapper[4585]: E1201 14:46:31.639668 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c35f110-b7a3-4cbc-b181-1589a74f5d89" containerName="tempest-tests-tempest-tests-runner" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.639725 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c35f110-b7a3-4cbc-b181-1589a74f5d89" containerName="tempest-tests-tempest-tests-runner" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.639989 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c35f110-b7a3-4cbc-b181-1589a74f5d89" containerName="tempest-tests-tempest-tests-runner" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.640145 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7900b55-d224-4908-9408-386643dc4e28" containerName="collect-profiles" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.641435 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.687245 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wp6p8"] Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.688248 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-catalog-content\") pod \"community-operators-wp6p8\" (UID: \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\") " pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.688289 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsl2\" (UniqueName: \"kubernetes.io/projected/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-kube-api-access-6xsl2\") pod \"community-operators-wp6p8\" (UID: \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\") " pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.688310 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-utilities\") pod \"community-operators-wp6p8\" (UID: \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\") " pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.688424 4585 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.789230 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xsl2\" (UniqueName: 
\"kubernetes.io/projected/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-kube-api-access-6xsl2\") pod \"community-operators-wp6p8\" (UID: \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\") " pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.789272 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-utilities\") pod \"community-operators-wp6p8\" (UID: \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\") " pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.789400 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-catalog-content\") pod \"community-operators-wp6p8\" (UID: \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\") " pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.789800 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-catalog-content\") pod \"community-operators-wp6p8\" (UID: \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\") " pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.789948 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-utilities\") pod \"community-operators-wp6p8\" (UID: \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\") " pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.809647 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xsl2\" (UniqueName: \"kubernetes.io/projected/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-kube-api-access-6xsl2\") pod \"community-operators-wp6p8\" (UID: \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\") " pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.891399 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8c35f110-b7a3-4cbc-b181-1589a74f5d89","Type":"ContainerDied","Data":"ae55f3f6138b44f348dfe68c98045f56fde6028006a698c4d0370376785452e3"} Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.891452 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae55f3f6138b44f348dfe68c98045f56fde6028006a698c4d0370376785452e3" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.891540 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 14:46:31 crc kubenswrapper[4585]: I1201 14:46:31.960099 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:32 crc kubenswrapper[4585]: I1201 14:46:32.492510 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wp6p8"] Dec 01 14:46:32 crc kubenswrapper[4585]: I1201 14:46:32.901358 4585 generic.go:334] "Generic (PLEG): container finished" podID="b2c2500b-2e6f-4158-a9e9-91c3105e9f18" containerID="87101e33c80202fdc4dff746b13072da5d650253b62e2ff82002b6ca130fa404" exitCode=0 Dec 01 14:46:32 crc kubenswrapper[4585]: I1201 14:46:32.901460 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wp6p8" event={"ID":"b2c2500b-2e6f-4158-a9e9-91c3105e9f18","Type":"ContainerDied","Data":"87101e33c80202fdc4dff746b13072da5d650253b62e2ff82002b6ca130fa404"} Dec 01 14:46:32 crc kubenswrapper[4585]: I1201 14:46:32.901607 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wp6p8" event={"ID":"b2c2500b-2e6f-4158-a9e9-91c3105e9f18","Type":"ContainerStarted","Data":"06c8b11470b3bc7f3fce1059d4676996f26aa6a7e5e253f1042ada810dc1e2cd"} Dec 01 14:46:32 crc kubenswrapper[4585]: I1201 14:46:32.903214 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 14:46:34 crc kubenswrapper[4585]: I1201 14:46:34.919652 4585 generic.go:334] "Generic (PLEG): container finished" podID="b2c2500b-2e6f-4158-a9e9-91c3105e9f18" containerID="aa3a6c4128ce2991b0e35e307f47901a3b4113801665aff5b42f6ac5d3b6cfa0" exitCode=0 Dec 01 14:46:34 crc kubenswrapper[4585]: I1201 14:46:34.919808 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wp6p8" event={"ID":"b2c2500b-2e6f-4158-a9e9-91c3105e9f18","Type":"ContainerDied","Data":"aa3a6c4128ce2991b0e35e307f47901a3b4113801665aff5b42f6ac5d3b6cfa0"} Dec 01 14:46:36 crc kubenswrapper[4585]: I1201 14:46:36.941208 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wp6p8" event={"ID":"b2c2500b-2e6f-4158-a9e9-91c3105e9f18","Type":"ContainerStarted","Data":"aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3"} Dec 01 14:46:36 crc kubenswrapper[4585]: I1201 14:46:36.970877 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wp6p8" podStartSLOduration=2.932395905 podStartE2EDuration="5.97085476s" podCreationTimestamp="2025-12-01 14:46:31 +0000 UTC" firstStartedPulling="2025-12-01 14:46:32.902954498 +0000 UTC m=+2906.887168353" lastFinishedPulling="2025-12-01 14:46:35.941413333 +0000 UTC m=+2909.925627208" observedRunningTime="2025-12-01 14:46:36.964202367 +0000 UTC m=+2910.948416222" watchObservedRunningTime="2025-12-01 14:46:36.97085476 +0000 UTC m=+2910.955068615" Dec 01 14:46:41 crc kubenswrapper[4585]: I1201 14:46:41.960592 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:41 crc kubenswrapper[4585]: I1201 14:46:41.962096 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:42 crc kubenswrapper[4585]: I1201 14:46:42.008798 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:42 crc kubenswrapper[4585]: I1201 14:46:42.058686 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:42 crc kubenswrapper[4585]: I1201 14:46:42.260005 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wp6p8"] Dec 01 14:46:43 crc kubenswrapper[4585]: I1201 14:46:43.999308 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wp6p8" podUID="b2c2500b-2e6f-4158-a9e9-91c3105e9f18" containerName="registry-server" containerID="cri-o://aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3" gracePeriod=2 Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.087825 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.089090 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.094218 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-twgmj" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.097363 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.222582 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndxnm\" (UniqueName: \"kubernetes.io/projected/e0b341a5-c1b2-40a7-b2c4-0128fe7a389f-kube-api-access-ndxnm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b341a5-c1b2-40a7-b2c4-0128fe7a389f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.223006 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b341a5-c1b2-40a7-b2c4-0128fe7a389f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.324833 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndxnm\" (UniqueName: \"kubernetes.io/projected/e0b341a5-c1b2-40a7-b2c4-0128fe7a389f-kube-api-access-ndxnm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b341a5-c1b2-40a7-b2c4-0128fe7a389f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.324927 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b341a5-c1b2-40a7-b2c4-0128fe7a389f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.325454 4585 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b341a5-c1b2-40a7-b2c4-0128fe7a389f\") device mount path \"/mnt/openstack/pv08\"" 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.344046 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndxnm\" (UniqueName: \"kubernetes.io/projected/e0b341a5-c1b2-40a7-b2c4-0128fe7a389f-kube-api-access-ndxnm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b341a5-c1b2-40a7-b2c4-0128fe7a389f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.350329 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b341a5-c1b2-40a7-b2c4-0128fe7a389f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.454486 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.463652 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.531235 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xsl2\" (UniqueName: \"kubernetes.io/projected/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-kube-api-access-6xsl2\") pod \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\" (UID: \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\") " Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.531356 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-catalog-content\") pod \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\" (UID: \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\") " Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.531415 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-utilities\") pod \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\" (UID: \"b2c2500b-2e6f-4158-a9e9-91c3105e9f18\") " Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.533516 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-utilities" (OuterVolumeSpecName: "utilities") pod "b2c2500b-2e6f-4158-a9e9-91c3105e9f18" (UID: "b2c2500b-2e6f-4158-a9e9-91c3105e9f18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.534847 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-kube-api-access-6xsl2" (OuterVolumeSpecName: "kube-api-access-6xsl2") pod "b2c2500b-2e6f-4158-a9e9-91c3105e9f18" (UID: "b2c2500b-2e6f-4158-a9e9-91c3105e9f18"). InnerVolumeSpecName "kube-api-access-6xsl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.598097 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2c2500b-2e6f-4158-a9e9-91c3105e9f18" (UID: "b2c2500b-2e6f-4158-a9e9-91c3105e9f18"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.634143 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.634174 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xsl2\" (UniqueName: \"kubernetes.io/projected/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-kube-api-access-6xsl2\") on node \"crc\" DevicePath \"\"" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.634191 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2c2500b-2e6f-4158-a9e9-91c3105e9f18-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:46:44 crc kubenswrapper[4585]: I1201 14:46:44.921356 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.009247 4585 generic.go:334] "Generic (PLEG): container finished" podID="b2c2500b-2e6f-4158-a9e9-91c3105e9f18" containerID="aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3" exitCode=0 Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.009313 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wp6p8" event={"ID":"b2c2500b-2e6f-4158-a9e9-91c3105e9f18","Type":"ContainerDied","Data":"aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3"} Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.009339 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wp6p8" event={"ID":"b2c2500b-2e6f-4158-a9e9-91c3105e9f18","Type":"ContainerDied","Data":"06c8b11470b3bc7f3fce1059d4676996f26aa6a7e5e253f1042ada810dc1e2cd"} Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.009344 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wp6p8" Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.009357 4585 scope.go:117] "RemoveContainer" containerID="aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3" Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.011636 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e0b341a5-c1b2-40a7-b2c4-0128fe7a389f","Type":"ContainerStarted","Data":"2d5f96a7fc97b0f04099c74107c2bf8153289ff8ae23235829cd696ba043663a"} Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.034544 4585 scope.go:117] "RemoveContainer" containerID="aa3a6c4128ce2991b0e35e307f47901a3b4113801665aff5b42f6ac5d3b6cfa0" Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.043714 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wp6p8"] Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.051998 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wp6p8"] Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.066067 4585 scope.go:117] "RemoveContainer" containerID="87101e33c80202fdc4dff746b13072da5d650253b62e2ff82002b6ca130fa404" Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.088601 4585 scope.go:117] "RemoveContainer" containerID="aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3" Dec 01 14:46:45 crc kubenswrapper[4585]: E1201 14:46:45.090082 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3\": container with ID starting with aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3 not found: ID does not exist" containerID="aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3" Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.090133 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3"} err="failed to get container status \"aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3\": rpc error: code = NotFound desc = could not find container \"aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3\": container with ID starting with aea16d5f1065a14c30a1e7510dd068f9c159c6d28b08d05d3ff91d35874d7ec3 not found: ID does not exist" Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.090159 4585 scope.go:117] "RemoveContainer" containerID="aa3a6c4128ce2991b0e35e307f47901a3b4113801665aff5b42f6ac5d3b6cfa0" Dec 01 14:46:45 crc kubenswrapper[4585]: E1201 14:46:45.092045 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3a6c4128ce2991b0e35e307f47901a3b4113801665aff5b42f6ac5d3b6cfa0\": container with ID starting with aa3a6c4128ce2991b0e35e307f47901a3b4113801665aff5b42f6ac5d3b6cfa0 not found: ID does not exist" containerID="aa3a6c4128ce2991b0e35e307f47901a3b4113801665aff5b42f6ac5d3b6cfa0" Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.092071 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3a6c4128ce2991b0e35e307f47901a3b4113801665aff5b42f6ac5d3b6cfa0"} err="failed to get container status \"aa3a6c4128ce2991b0e35e307f47901a3b4113801665aff5b42f6ac5d3b6cfa0\": rpc error: code = NotFound desc = 
could not find container \"aa3a6c4128ce2991b0e35e307f47901a3b4113801665aff5b42f6ac5d3b6cfa0\": container with ID starting with aa3a6c4128ce2991b0e35e307f47901a3b4113801665aff5b42f6ac5d3b6cfa0 not found: ID does not exist" Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.092092 4585 scope.go:117] "RemoveContainer" containerID="87101e33c80202fdc4dff746b13072da5d650253b62e2ff82002b6ca130fa404" Dec 01 14:46:45 crc kubenswrapper[4585]: E1201 14:46:45.092335 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87101e33c80202fdc4dff746b13072da5d650253b62e2ff82002b6ca130fa404\": container with ID starting with 87101e33c80202fdc4dff746b13072da5d650253b62e2ff82002b6ca130fa404 not found: ID does not exist" containerID="87101e33c80202fdc4dff746b13072da5d650253b62e2ff82002b6ca130fa404" Dec 01 14:46:45 crc kubenswrapper[4585]: I1201 14:46:45.092358 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87101e33c80202fdc4dff746b13072da5d650253b62e2ff82002b6ca130fa404"} err="failed to get container status \"87101e33c80202fdc4dff746b13072da5d650253b62e2ff82002b6ca130fa404\": rpc error: code = NotFound desc = could not find container \"87101e33c80202fdc4dff746b13072da5d650253b62e2ff82002b6ca130fa404\": container with ID starting with 87101e33c80202fdc4dff746b13072da5d650253b62e2ff82002b6ca130fa404 not found: ID does not exist" Dec 01 14:46:46 crc kubenswrapper[4585]: I1201 14:46:46.424863 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c2500b-2e6f-4158-a9e9-91c3105e9f18" path="/var/lib/kubelet/pods/b2c2500b-2e6f-4158-a9e9-91c3105e9f18/volumes" Dec 01 14:46:47 crc kubenswrapper[4585]: I1201 14:46:47.039553 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e0b341a5-c1b2-40a7-b2c4-0128fe7a389f","Type":"ContainerStarted","Data":"f1f83999821267538a1fef416c3f1ff408e94c52e206021693ff1b6bbd4500d7"} Dec 01 14:46:47 crc kubenswrapper[4585]: I1201 14:46:47.059656 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.780608003 podStartE2EDuration="3.059635905s" podCreationTimestamp="2025-12-01 14:46:44 +0000 UTC" firstStartedPulling="2025-12-01 14:46:44.925829256 +0000 UTC m=+2918.910043101" lastFinishedPulling="2025-12-01 14:46:46.204857148 +0000 UTC m=+2920.189071003" observedRunningTime="2025-12-01 14:46:47.057222081 +0000 UTC m=+2921.041436016" watchObservedRunningTime="2025-12-01 14:46:47.059635905 +0000 UTC m=+2921.043849760" Dec 01 14:47:08 crc kubenswrapper[4585]: I1201 14:47:08.845643 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p4nbp/must-gather-hczfk"] Dec 01 14:47:08 crc kubenswrapper[4585]: E1201 14:47:08.855862 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c2500b-2e6f-4158-a9e9-91c3105e9f18" containerName="extract-utilities" Dec 01 14:47:08 crc kubenswrapper[4585]: I1201 14:47:08.855901 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c2500b-2e6f-4158-a9e9-91c3105e9f18" containerName="extract-utilities" Dec 01 14:47:08 crc kubenswrapper[4585]: E1201 14:47:08.855914 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c2500b-2e6f-4158-a9e9-91c3105e9f18" containerName="registry-server" Dec 01 14:47:08 crc kubenswrapper[4585]: I1201 14:47:08.855921 4585 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c2500b-2e6f-4158-a9e9-91c3105e9f18" containerName="registry-server" Dec 01 14:47:08 crc kubenswrapper[4585]: E1201 14:47:08.855931 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c2500b-2e6f-4158-a9e9-91c3105e9f18" containerName="extract-content" Dec 01 14:47:08 crc kubenswrapper[4585]: I1201 14:47:08.855938 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c2500b-2e6f-4158-a9e9-91c3105e9f18" containerName="extract-content" Dec 01 14:47:08 crc kubenswrapper[4585]: I1201 14:47:08.856210 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c2500b-2e6f-4158-a9e9-91c3105e9f18" containerName="registry-server" Dec 01 14:47:08 crc kubenswrapper[4585]: I1201 14:47:08.857199 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4nbp/must-gather-hczfk" Dec 01 14:47:08 crc kubenswrapper[4585]: I1201 14:47:08.862287 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-p4nbp"/"openshift-service-ca.crt" Dec 01 14:47:08 crc kubenswrapper[4585]: I1201 14:47:08.870612 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p4nbp/must-gather-hczfk"] Dec 01 14:47:08 crc kubenswrapper[4585]: I1201 14:47:08.870857 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-p4nbp"/"kube-root-ca.crt" Dec 01 14:47:08 crc kubenswrapper[4585]: I1201 14:47:08.930522 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0bdd4164-b891-436c-805e-1cf3a07fb6c4-must-gather-output\") pod \"must-gather-hczfk\" (UID: \"0bdd4164-b891-436c-805e-1cf3a07fb6c4\") " pod="openshift-must-gather-p4nbp/must-gather-hczfk" Dec 01 14:47:08 crc kubenswrapper[4585]: I1201 14:47:08.930633 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82h8s\" (UniqueName: \"kubernetes.io/projected/0bdd4164-b891-436c-805e-1cf3a07fb6c4-kube-api-access-82h8s\") pod \"must-gather-hczfk\" (UID: \"0bdd4164-b891-436c-805e-1cf3a07fb6c4\") " pod="openshift-must-gather-p4nbp/must-gather-hczfk" Dec 01 14:47:09 crc kubenswrapper[4585]: I1201 14:47:09.032022 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0bdd4164-b891-436c-805e-1cf3a07fb6c4-must-gather-output\") pod \"must-gather-hczfk\" (UID: \"0bdd4164-b891-436c-805e-1cf3a07fb6c4\") " pod="openshift-must-gather-p4nbp/must-gather-hczfk" Dec 01 14:47:09 crc kubenswrapper[4585]: I1201 14:47:09.032090 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82h8s\" (UniqueName: \"kubernetes.io/projected/0bdd4164-b891-436c-805e-1cf3a07fb6c4-kube-api-access-82h8s\") pod \"must-gather-hczfk\" (UID: \"0bdd4164-b891-436c-805e-1cf3a07fb6c4\") " pod="openshift-must-gather-p4nbp/must-gather-hczfk" Dec 01 14:47:09 crc kubenswrapper[4585]: I1201 14:47:09.032451 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0bdd4164-b891-436c-805e-1cf3a07fb6c4-must-gather-output\") pod \"must-gather-hczfk\" (UID: \"0bdd4164-b891-436c-805e-1cf3a07fb6c4\") " pod="openshift-must-gather-p4nbp/must-gather-hczfk" Dec 01 14:47:09 crc kubenswrapper[4585]: I1201 14:47:09.075041 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82h8s\" (UniqueName: \"kubernetes.io/projected/0bdd4164-b891-436c-805e-1cf3a07fb6c4-kube-api-access-82h8s\") pod \"must-gather-hczfk\" (UID: \"0bdd4164-b891-436c-805e-1cf3a07fb6c4\") " pod="openshift-must-gather-p4nbp/must-gather-hczfk" Dec 01 14:47:09 crc kubenswrapper[4585]: I1201 14:47:09.174429 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4nbp/must-gather-hczfk" Dec 01 14:47:09 crc kubenswrapper[4585]: I1201 14:47:09.661439 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p4nbp/must-gather-hczfk"] Dec 01 14:47:10 crc kubenswrapper[4585]: I1201 14:47:10.310343 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4nbp/must-gather-hczfk" event={"ID":"0bdd4164-b891-436c-805e-1cf3a07fb6c4","Type":"ContainerStarted","Data":"8599999833addf67546a50923af2aa520bd50648352b9e6832402a4003ed08f6"} Dec 01 14:47:14 crc kubenswrapper[4585]: I1201 14:47:14.363129 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4nbp/must-gather-hczfk" event={"ID":"0bdd4164-b891-436c-805e-1cf3a07fb6c4","Type":"ContainerStarted","Data":"6d572da308856b6b00d934ab3af9b8de610d0f3946bbed91aeefcda8da0ffe3a"} Dec 01 14:47:14 crc kubenswrapper[4585]: I1201 14:47:14.365765 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4nbp/must-gather-hczfk" event={"ID":"0bdd4164-b891-436c-805e-1cf3a07fb6c4","Type":"ContainerStarted","Data":"64d5b79dcc2c3ee0ad987e672c1c09a119869ebbb2d3a8ed66b789bc68960a2f"} Dec 01 14:47:14 crc kubenswrapper[4585]: I1201 14:47:14.377872 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p4nbp/must-gather-hczfk" podStartSLOduration=2.218877914 podStartE2EDuration="6.377858169s" podCreationTimestamp="2025-12-01 14:47:08 +0000 UTC" firstStartedPulling="2025-12-01 14:47:09.670469297 +0000 UTC m=+2943.654683152" lastFinishedPulling="2025-12-01 14:47:13.829449542 +0000 UTC m=+2947.813663407" observedRunningTime="2025-12-01 14:47:14.375332502 +0000 UTC m=+2948.359546357" watchObservedRunningTime="2025-12-01 14:47:14.377858169 +0000 UTC m=+2948.362072024" Dec 01 14:47:18 crc kubenswrapper[4585]: I1201 14:47:18.241303 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p4nbp/crc-debug-js6vm"] Dec 01 14:47:18 crc kubenswrapper[4585]: I1201 14:47:18.242891 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p4nbp/crc-debug-js6vm" Dec 01 14:47:18 crc kubenswrapper[4585]: I1201 14:47:18.245813 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-p4nbp"/"default-dockercfg-k5bgs" Dec 01 14:47:18 crc kubenswrapper[4585]: I1201 14:47:18.413243 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c92g\" (UniqueName: \"kubernetes.io/projected/857fe42e-1276-481a-937d-62011acfeb9f-kube-api-access-7c92g\") pod \"crc-debug-js6vm\" (UID: \"857fe42e-1276-481a-937d-62011acfeb9f\") " pod="openshift-must-gather-p4nbp/crc-debug-js6vm" Dec 01 14:47:18 crc kubenswrapper[4585]: I1201 14:47:18.413336 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/857fe42e-1276-481a-937d-62011acfeb9f-host\") pod \"crc-debug-js6vm\" (UID: \"857fe42e-1276-481a-937d-62011acfeb9f\") " pod="openshift-must-gather-p4nbp/crc-debug-js6vm" Dec 01 14:47:18 crc kubenswrapper[4585]: I1201 14:47:18.515828 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/857fe42e-1276-481a-937d-62011acfeb9f-host\") pod \"crc-debug-js6vm\" (UID: \"857fe42e-1276-481a-937d-62011acfeb9f\") " pod="openshift-must-gather-p4nbp/crc-debug-js6vm" Dec 01 14:47:18 crc kubenswrapper[4585]: I1201 14:47:18.516170 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/857fe42e-1276-481a-937d-62011acfeb9f-host\") pod \"crc-debug-js6vm\" (UID: \"857fe42e-1276-481a-937d-62011acfeb9f\") " pod="openshift-must-gather-p4nbp/crc-debug-js6vm" Dec 01 14:47:18 crc kubenswrapper[4585]: I1201 14:47:18.517255 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c92g\" (UniqueName: \"kubernetes.io/projected/857fe42e-1276-481a-937d-62011acfeb9f-kube-api-access-7c92g\") pod \"crc-debug-js6vm\" (UID: \"857fe42e-1276-481a-937d-62011acfeb9f\") " pod="openshift-must-gather-p4nbp/crc-debug-js6vm" Dec 01 14:47:18 crc kubenswrapper[4585]: I1201 14:47:18.536249 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c92g\" (UniqueName: \"kubernetes.io/projected/857fe42e-1276-481a-937d-62011acfeb9f-kube-api-access-7c92g\") pod \"crc-debug-js6vm\" (UID: \"857fe42e-1276-481a-937d-62011acfeb9f\") " pod="openshift-must-gather-p4nbp/crc-debug-js6vm" Dec 01 14:47:18 crc kubenswrapper[4585]: I1201 14:47:18.560469 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p4nbp/crc-debug-js6vm" Dec 01 14:47:19 crc kubenswrapper[4585]: I1201 14:47:19.405522 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4nbp/crc-debug-js6vm" event={"ID":"857fe42e-1276-481a-937d-62011acfeb9f","Type":"ContainerStarted","Data":"797e1e10e566ddedab519214de4301a0b4278d7b4d270f102aa2ef886dcae115"} Dec 01 14:47:32 crc kubenswrapper[4585]: I1201 14:47:32.555478 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4nbp/crc-debug-js6vm" event={"ID":"857fe42e-1276-481a-937d-62011acfeb9f","Type":"ContainerStarted","Data":"baa05d52bdf2eb7b346d84c77f455a3c31addbc1d80676848b7348962187d9c2"} Dec 01 14:47:49 crc kubenswrapper[4585]: I1201 14:47:49.708718 4585 generic.go:334] "Generic (PLEG): container finished" podID="857fe42e-1276-481a-937d-62011acfeb9f" containerID="baa05d52bdf2eb7b346d84c77f455a3c31addbc1d80676848b7348962187d9c2" exitCode=0 Dec 01 14:47:49 crc kubenswrapper[4585]: I1201 14:47:49.708892 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4nbp/crc-debug-js6vm" event={"ID":"857fe42e-1276-481a-937d-62011acfeb9f","Type":"ContainerDied","Data":"baa05d52bdf2eb7b346d84c77f455a3c31addbc1d80676848b7348962187d9c2"} Dec 01 14:47:50 crc kubenswrapper[4585]: I1201 14:47:50.838964 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4nbp/crc-debug-js6vm" Dec 01 14:47:50 crc kubenswrapper[4585]: I1201 14:47:50.873897 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p4nbp/crc-debug-js6vm"] Dec 01 14:47:50 crc kubenswrapper[4585]: I1201 14:47:50.880806 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p4nbp/crc-debug-js6vm"] Dec 01 14:47:50 crc kubenswrapper[4585]: I1201 14:47:50.927445 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/857fe42e-1276-481a-937d-62011acfeb9f-host\") pod \"857fe42e-1276-481a-937d-62011acfeb9f\" (UID: \"857fe42e-1276-481a-937d-62011acfeb9f\") " Dec 01 14:47:50 crc kubenswrapper[4585]: I1201 14:47:50.927784 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c92g\" (UniqueName: \"kubernetes.io/projected/857fe42e-1276-481a-937d-62011acfeb9f-kube-api-access-7c92g\") pod \"857fe42e-1276-481a-937d-62011acfeb9f\" (UID: \"857fe42e-1276-481a-937d-62011acfeb9f\") " Dec 01 14:47:50 crc kubenswrapper[4585]: I1201 14:47:50.927563 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/857fe42e-1276-481a-937d-62011acfeb9f-host" (OuterVolumeSpecName: "host") pod "857fe42e-1276-481a-937d-62011acfeb9f" (UID: "857fe42e-1276-481a-937d-62011acfeb9f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:47:50 crc kubenswrapper[4585]: I1201 14:47:50.928441 4585 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/857fe42e-1276-481a-937d-62011acfeb9f-host\") on node \"crc\" DevicePath \"\"" Dec 01 14:47:50 crc kubenswrapper[4585]: I1201 14:47:50.934410 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/857fe42e-1276-481a-937d-62011acfeb9f-kube-api-access-7c92g" (OuterVolumeSpecName: "kube-api-access-7c92g") pod "857fe42e-1276-481a-937d-62011acfeb9f" (UID: "857fe42e-1276-481a-937d-62011acfeb9f"). 
InnerVolumeSpecName "kube-api-access-7c92g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:47:51 crc kubenswrapper[4585]: I1201 14:47:51.030140 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c92g\" (UniqueName: \"kubernetes.io/projected/857fe42e-1276-481a-937d-62011acfeb9f-kube-api-access-7c92g\") on node \"crc\" DevicePath \"\"" Dec 01 14:47:51 crc kubenswrapper[4585]: I1201 14:47:51.727145 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="797e1e10e566ddedab519214de4301a0b4278d7b4d270f102aa2ef886dcae115" Dec 01 14:47:51 crc kubenswrapper[4585]: I1201 14:47:51.727219 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4nbp/crc-debug-js6vm" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.083533 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p4nbp/crc-debug-dzbxb"] Dec 01 14:47:52 crc kubenswrapper[4585]: E1201 14:47:52.084706 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857fe42e-1276-481a-937d-62011acfeb9f" containerName="container-00" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.084779 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="857fe42e-1276-481a-937d-62011acfeb9f" containerName="container-00" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.085057 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="857fe42e-1276-481a-937d-62011acfeb9f" containerName="container-00" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.085713 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4nbp/crc-debug-dzbxb" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.089608 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-p4nbp"/"default-dockercfg-k5bgs" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.148655 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkzrr\" (UniqueName: \"kubernetes.io/projected/2d09eaea-79f0-48ee-93a4-8260a8006f5d-kube-api-access-dkzrr\") pod \"crc-debug-dzbxb\" (UID: \"2d09eaea-79f0-48ee-93a4-8260a8006f5d\") " pod="openshift-must-gather-p4nbp/crc-debug-dzbxb" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.149004 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d09eaea-79f0-48ee-93a4-8260a8006f5d-host\") pod \"crc-debug-dzbxb\" (UID: \"2d09eaea-79f0-48ee-93a4-8260a8006f5d\") " pod="openshift-must-gather-p4nbp/crc-debug-dzbxb" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.250778 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkzrr\" (UniqueName: \"kubernetes.io/projected/2d09eaea-79f0-48ee-93a4-8260a8006f5d-kube-api-access-dkzrr\") pod \"crc-debug-dzbxb\" (UID: \"2d09eaea-79f0-48ee-93a4-8260a8006f5d\") " pod="openshift-must-gather-p4nbp/crc-debug-dzbxb" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.250900 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d09eaea-79f0-48ee-93a4-8260a8006f5d-host\") pod \"crc-debug-dzbxb\" (UID: \"2d09eaea-79f0-48ee-93a4-8260a8006f5d\") " pod="openshift-must-gather-p4nbp/crc-debug-dzbxb" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.251026 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d09eaea-79f0-48ee-93a4-8260a8006f5d-host\") pod \"crc-debug-dzbxb\" (UID: \"2d09eaea-79f0-48ee-93a4-8260a8006f5d\") " pod="openshift-must-gather-p4nbp/crc-debug-dzbxb" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.270801 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkzrr\" (UniqueName: \"kubernetes.io/projected/2d09eaea-79f0-48ee-93a4-8260a8006f5d-kube-api-access-dkzrr\") pod \"crc-debug-dzbxb\" (UID: \"2d09eaea-79f0-48ee-93a4-8260a8006f5d\") " pod="openshift-must-gather-p4nbp/crc-debug-dzbxb" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.400348 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4nbp/crc-debug-dzbxb" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.442380 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="857fe42e-1276-481a-937d-62011acfeb9f" path="/var/lib/kubelet/pods/857fe42e-1276-481a-937d-62011acfeb9f/volumes" Dec 01 14:47:52 crc kubenswrapper[4585]: I1201 14:47:52.737344 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4nbp/crc-debug-dzbxb" event={"ID":"2d09eaea-79f0-48ee-93a4-8260a8006f5d","Type":"ContainerStarted","Data":"438f807cfee10cfbe07befaea10c749ab610bbe8e37008d9689ad52f3894e606"} Dec 01 14:47:53 crc kubenswrapper[4585]: I1201 14:47:53.746201 4585 generic.go:334] "Generic (PLEG): container finished" podID="2d09eaea-79f0-48ee-93a4-8260a8006f5d" containerID="d7f8a663fce8d9a74757526c49516068bece6968b3e021c90fb93d1d3df79b7e" exitCode=1 Dec 01 14:47:53 crc kubenswrapper[4585]: I1201 14:47:53.746366 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4nbp/crc-debug-dzbxb" event={"ID":"2d09eaea-79f0-48ee-93a4-8260a8006f5d","Type":"ContainerDied","Data":"d7f8a663fce8d9a74757526c49516068bece6968b3e021c90fb93d1d3df79b7e"} Dec 01 14:47:53 crc kubenswrapper[4585]: I1201 14:47:53.782250 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p4nbp/crc-debug-dzbxb"] Dec 01 14:47:53 crc kubenswrapper[4585]: I1201 14:47:53.796552 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p4nbp/crc-debug-dzbxb"] Dec 01 14:47:54 crc kubenswrapper[4585]: I1201 14:47:54.885932 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p4nbp/crc-debug-dzbxb" Dec 01 14:47:55 crc kubenswrapper[4585]: I1201 14:47:55.006520 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkzrr\" (UniqueName: \"kubernetes.io/projected/2d09eaea-79f0-48ee-93a4-8260a8006f5d-kube-api-access-dkzrr\") pod \"2d09eaea-79f0-48ee-93a4-8260a8006f5d\" (UID: \"2d09eaea-79f0-48ee-93a4-8260a8006f5d\") " Dec 01 14:47:55 crc kubenswrapper[4585]: I1201 14:47:55.007204 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d09eaea-79f0-48ee-93a4-8260a8006f5d-host\") pod \"2d09eaea-79f0-48ee-93a4-8260a8006f5d\" (UID: \"2d09eaea-79f0-48ee-93a4-8260a8006f5d\") " Dec 01 14:47:55 crc kubenswrapper[4585]: I1201 14:47:55.007587 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d09eaea-79f0-48ee-93a4-8260a8006f5d-host" (OuterVolumeSpecName: "host") pod "2d09eaea-79f0-48ee-93a4-8260a8006f5d" (UID: "2d09eaea-79f0-48ee-93a4-8260a8006f5d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:47:55 crc kubenswrapper[4585]: I1201 14:47:55.007719 4585 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d09eaea-79f0-48ee-93a4-8260a8006f5d-host\") on node \"crc\" DevicePath \"\"" Dec 01 14:47:55 crc kubenswrapper[4585]: I1201 14:47:55.019237 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d09eaea-79f0-48ee-93a4-8260a8006f5d-kube-api-access-dkzrr" (OuterVolumeSpecName: "kube-api-access-dkzrr") pod "2d09eaea-79f0-48ee-93a4-8260a8006f5d" (UID: "2d09eaea-79f0-48ee-93a4-8260a8006f5d"). InnerVolumeSpecName "kube-api-access-dkzrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:47:55 crc kubenswrapper[4585]: I1201 14:47:55.109098 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkzrr\" (UniqueName: \"kubernetes.io/projected/2d09eaea-79f0-48ee-93a4-8260a8006f5d-kube-api-access-dkzrr\") on node \"crc\" DevicePath \"\"" Dec 01 14:47:55 crc kubenswrapper[4585]: I1201 14:47:55.786251 4585 scope.go:117] "RemoveContainer" containerID="d7f8a663fce8d9a74757526c49516068bece6968b3e021c90fb93d1d3df79b7e" Dec 01 14:47:55 crc kubenswrapper[4585]: I1201 14:47:55.786285 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-p4nbp/crc-debug-dzbxb" Dec 01 14:47:56 crc kubenswrapper[4585]: I1201 14:47:56.429786 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d09eaea-79f0-48ee-93a4-8260a8006f5d" path="/var/lib/kubelet/pods/2d09eaea-79f0-48ee-93a4-8260a8006f5d/volumes" Dec 01 14:48:28 crc kubenswrapper[4585]: I1201 14:48:28.796882 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6564675f78-48rkf_87101522-5785-472c-9563-d86146676171/barbican-api/0.log" Dec 01 14:48:28 crc kubenswrapper[4585]: I1201 14:48:28.952381 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6564675f78-48rkf_87101522-5785-472c-9563-d86146676171/barbican-api-log/0.log" Dec 01 14:48:28 crc kubenswrapper[4585]: I1201 14:48:28.990111 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-868456d7cd-m64gp_31b37f1e-9af0-4922-8e48-55f82175411c/barbican-keystone-listener/0.log" Dec 01 14:48:29 crc kubenswrapper[4585]: I1201 14:48:29.152679 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-868456d7cd-m64gp_31b37f1e-9af0-4922-8e48-55f82175411c/barbican-keystone-listener-log/0.log" Dec 01 14:48:29 crc kubenswrapper[4585]: I1201 14:48:29.234269 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65f6dd57f9-22fc5_61f0d5fb-5daa-4828-a1a8-92dc39a7c822/barbican-worker/0.log" Dec 01 14:48:29 crc kubenswrapper[4585]: I1201 14:48:29.430257 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65f6dd57f9-22fc5_61f0d5fb-5daa-4828-a1a8-92dc39a7c822/barbican-worker-log/0.log" Dec 01 14:48:29 crc kubenswrapper[4585]: I1201 14:48:29.469675 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr_9d1f4c36-f08f-4359-a950-a506d064998b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:29 crc kubenswrapper[4585]: I1201 14:48:29.619770 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7642d2c9-b2cc-400c-b45a-957690fb2e86/ceilometer-central-agent/0.log" Dec 01 14:48:29 crc kubenswrapper[4585]: I1201 14:48:29.673936 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7642d2c9-b2cc-400c-b45a-957690fb2e86/ceilometer-notification-agent/0.log" Dec 01 14:48:29 crc kubenswrapper[4585]: I1201 14:48:29.733354 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7642d2c9-b2cc-400c-b45a-957690fb2e86/sg-core/0.log" Dec 01 14:48:29 crc kubenswrapper[4585]: I1201 14:48:29.746875 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7642d2c9-b2cc-400c-b45a-957690fb2e86/proxy-httpd/0.log" Dec 01 14:48:29 crc kubenswrapper[4585]: I1201 14:48:29.993151 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_952acec2-d757-4b65-aaf3-61bb69e5d5d7/cinder-api/0.log" Dec 01 14:48:30 crc kubenswrapper[4585]: I1201 14:48:30.098359 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_952acec2-d757-4b65-aaf3-61bb69e5d5d7/cinder-api-log/0.log" Dec 01 14:48:30 crc kubenswrapper[4585]: I1201 14:48:30.235321 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_68b9fd2e-0f05-46d6-86aa-319cbbf01db1/cinder-scheduler/0.log" Dec 01 14:48:30 crc kubenswrapper[4585]: 
I1201 14:48:30.313436 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_68b9fd2e-0f05-46d6-86aa-319cbbf01db1/probe/0.log" Dec 01 14:48:30 crc kubenswrapper[4585]: I1201 14:48:30.358648 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h_16e7590a-927c-4ff1-8eb8-3b4a248ce6f8/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:30 crc kubenswrapper[4585]: I1201 14:48:30.611157 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8_c12f5739-060f-4047-b987-d8c958aeb133/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:30 crc kubenswrapper[4585]: I1201 14:48:30.629943 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cd9bffc9-cjbbt_7e89a01c-fb21-4027-bbeb-6bfe70da33d0/init/0.log" Dec 01 14:48:30 crc kubenswrapper[4585]: I1201 14:48:30.872627 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cd9bffc9-cjbbt_7e89a01c-fb21-4027-bbeb-6bfe70da33d0/init/0.log" Dec 01 14:48:30 crc kubenswrapper[4585]: I1201 14:48:30.924396 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cd9bffc9-cjbbt_7e89a01c-fb21-4027-bbeb-6bfe70da33d0/dnsmasq-dns/0.log" Dec 01 14:48:31 crc kubenswrapper[4585]: I1201 14:48:31.003496 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp_bed7040b-db55-41ae-9384-7b730ced5331/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:31 crc kubenswrapper[4585]: I1201 14:48:31.197737 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2e570020-789a-4807-9cff-651caad31856/glance-httpd/0.log" Dec 01 14:48:31 crc kubenswrapper[4585]: I1201 14:48:31.221019 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2e570020-789a-4807-9cff-651caad31856/glance-log/0.log" Dec 01 14:48:31 crc kubenswrapper[4585]: I1201 14:48:31.353051 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_bba1bc55-ebde-45b0-a2bf-1b05117fc134/glance-httpd/0.log" Dec 01 14:48:31 crc kubenswrapper[4585]: I1201 14:48:31.423023 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_bba1bc55-ebde-45b0-a2bf-1b05117fc134/glance-log/0.log" Dec 01 14:48:31 crc kubenswrapper[4585]: I1201 14:48:31.663730 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bbf659b46-55tth_e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1/horizon/0.log" Dec 01 14:48:31 crc kubenswrapper[4585]: I1201 14:48:31.667338 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bbf659b46-55tth_e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1/horizon/1.log" Dec 01 14:48:31 crc kubenswrapper[4585]: I1201 14:48:31.946655 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bbf659b46-55tth_e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1/horizon-log/0.log" Dec 01 14:48:32 crc kubenswrapper[4585]: I1201 14:48:32.084558 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-pzrlq_0919f038-3bf5-4f3c-baa3-5c85ceef4819/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:32 crc kubenswrapper[4585]: I1201 
14:48:32.108794 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-29vjp_f27fcf2c-32e5-488b-b1f2-3f61b3096f4f/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:32 crc kubenswrapper[4585]: I1201 14:48:32.347921 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-576d96b8bf-jl74m_2ccd2ad6-369c-4511-9704-9f091dac6dd7/keystone-api/0.log" Dec 01 14:48:32 crc kubenswrapper[4585]: I1201 14:48:32.420295 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e6dc37b7-09de-4e17-9d88-358b3d3d5908/kube-state-metrics/0.log" Dec 01 14:48:32 crc kubenswrapper[4585]: I1201 14:48:32.686143 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw_a592f160-6520-4d70-94bd-5064e63fa1a0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:33 crc kubenswrapper[4585]: I1201 14:48:33.259767 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-fc9dbdd9-h6k6z_197759e4-0035-4430-9bee-483578d6804e/neutron-api/0.log" Dec 01 14:48:33 crc kubenswrapper[4585]: I1201 14:48:33.299544 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8_1a9f93f2-8afc-428b-9578-dd3353f8a43b/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:33 crc kubenswrapper[4585]: I1201 14:48:33.358214 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-fc9dbdd9-h6k6z_197759e4-0035-4430-9bee-483578d6804e/neutron-httpd/0.log" Dec 01 14:48:33 crc kubenswrapper[4585]: I1201 14:48:33.808244 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f2bbc3d0-64bb-4942-8ee2-a05b538ec68f/nova-api-log/0.log" Dec 01 14:48:33 crc kubenswrapper[4585]: I1201 14:48:33.904183 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_1ae9e96d-f1e0-4183-9034-f553c8af4864/nova-cell0-conductor-conductor/0.log" Dec 01 14:48:33 crc kubenswrapper[4585]: I1201 14:48:33.928759 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f2bbc3d0-64bb-4942-8ee2-a05b538ec68f/nova-api-api/0.log" Dec 01 14:48:34 crc kubenswrapper[4585]: I1201 14:48:34.200732 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4cf9cb62-4c4f-43ae-8a94-78eafbb19a82/nova-cell1-conductor-conductor/0.log" Dec 01 14:48:34 crc kubenswrapper[4585]: I1201 14:48:34.290062 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_1ce769f5-6fc1-4585-a050-98ec1e1d9915/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 14:48:34 crc kubenswrapper[4585]: I1201 14:48:34.554585 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-b48gn_a10b857d-29b7-46a5-9c12-775200f3ab74/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:34 crc kubenswrapper[4585]: I1201 14:48:34.842566 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_61c99c9c-c763-4e3a-8e71-6d78e3779991/nova-metadata-log/0.log" Dec 01 14:48:35 crc kubenswrapper[4585]: I1201 14:48:35.074360 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_779dee18-6c1e-4e00-b3be-22ce3d8e2259/nova-scheduler-scheduler/0.log" Dec 01 14:48:35 crc kubenswrapper[4585]: I1201 
14:48:35.169197 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_471a678b-d81a-4526-b826-65b359685c99/mysql-bootstrap/0.log" Dec 01 14:48:35 crc kubenswrapper[4585]: I1201 14:48:35.476032 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_471a678b-d81a-4526-b826-65b359685c99/mysql-bootstrap/0.log" Dec 01 14:48:35 crc kubenswrapper[4585]: I1201 14:48:35.539221 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_471a678b-d81a-4526-b826-65b359685c99/galera/0.log" Dec 01 14:48:35 crc kubenswrapper[4585]: I1201 14:48:35.682669 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_61a59437-2c03-417a-839f-6b610fa43a83/mysql-bootstrap/0.log" Dec 01 14:48:35 crc kubenswrapper[4585]: I1201 14:48:35.710628 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_61c99c9c-c763-4e3a-8e71-6d78e3779991/nova-metadata-metadata/0.log" Dec 01 14:48:36 crc kubenswrapper[4585]: I1201 14:48:36.003707 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_61a59437-2c03-417a-839f-6b610fa43a83/galera/0.log" Dec 01 14:48:36 crc kubenswrapper[4585]: I1201 14:48:36.014259 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_29f23dd7-40f6-4677-bd4c-ebdf3152b72f/openstackclient/0.log" Dec 01 14:48:36 crc kubenswrapper[4585]: I1201 14:48:36.102575 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_61a59437-2c03-417a-839f-6b610fa43a83/mysql-bootstrap/0.log" Dec 01 14:48:36 crc kubenswrapper[4585]: I1201 14:48:36.327936 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-m7xvm_2f3d9474-e60e-401e-8597-1bd7af4f34c3/ovn-controller/0.log" Dec 01 14:48:36 crc kubenswrapper[4585]: I1201 14:48:36.428708 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xddp8_5299e047-b328-440f-a888-8001cad4933b/openstack-network-exporter/0.log" Dec 01 14:48:36 crc kubenswrapper[4585]: I1201 14:48:36.709517 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rt4xq_7c01d629-7b26-457f-8ab7-e67464b2e578/ovsdb-server-init/0.log" Dec 01 14:48:36 crc kubenswrapper[4585]: I1201 14:48:36.886081 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rt4xq_7c01d629-7b26-457f-8ab7-e67464b2e578/ovs-vswitchd/0.log" Dec 01 14:48:36 crc kubenswrapper[4585]: I1201 14:48:36.967089 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rt4xq_7c01d629-7b26-457f-8ab7-e67464b2e578/ovsdb-server-init/0.log" Dec 01 14:48:37 crc kubenswrapper[4585]: I1201 14:48:37.030635 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rt4xq_7c01d629-7b26-457f-8ab7-e67464b2e578/ovsdb-server/0.log" Dec 01 14:48:37 crc kubenswrapper[4585]: I1201 14:48:37.225875 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-n7w8n_3a6c0545-e1eb-412f-afee-4764733eff64/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:37 crc kubenswrapper[4585]: I1201 14:48:37.338468 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e15da9d0-0ba7-4885-8da4-89631b7886f6/ovn-northd/0.log" Dec 01 14:48:37 crc kubenswrapper[4585]: I1201 
14:48:37.353227 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e15da9d0-0ba7-4885-8da4-89631b7886f6/openstack-network-exporter/0.log" Dec 01 14:48:37 crc kubenswrapper[4585]: I1201 14:48:37.598543 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fa90f38d-8525-4b91-9a7b-717ddc968614/openstack-network-exporter/0.log" Dec 01 14:48:37 crc kubenswrapper[4585]: I1201 14:48:37.628106 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fa90f38d-8525-4b91-9a7b-717ddc968614/ovsdbserver-nb/0.log" Dec 01 14:48:37 crc kubenswrapper[4585]: I1201 14:48:37.827770 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a3ceca23-b268-4b00-a4c2-026390eae759/openstack-network-exporter/0.log" Dec 01 14:48:37 crc kubenswrapper[4585]: I1201 14:48:37.870772 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a3ceca23-b268-4b00-a4c2-026390eae759/ovsdbserver-sb/0.log" Dec 01 14:48:38 crc kubenswrapper[4585]: I1201 14:48:38.023608 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5594675dd-jdqsw_c8ba0e29-a5ba-4540-a9b6-154a30ff9e99/placement-api/0.log" Dec 01 14:48:38 crc kubenswrapper[4585]: I1201 14:48:38.150081 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_645c2200-d127-4ffe-a91e-9f9ae104dc06/setup-container/0.log" Dec 01 14:48:38 crc kubenswrapper[4585]: I1201 14:48:38.189843 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5594675dd-jdqsw_c8ba0e29-a5ba-4540-a9b6-154a30ff9e99/placement-log/0.log" Dec 01 14:48:38 crc kubenswrapper[4585]: I1201 14:48:38.469556 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b271e13c-b935-4f31-a32d-865af7228e55/setup-container/0.log" Dec 01 14:48:38 crc kubenswrapper[4585]: I1201 14:48:38.579005 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_645c2200-d127-4ffe-a91e-9f9ae104dc06/setup-container/0.log" Dec 01 14:48:38 crc kubenswrapper[4585]: I1201 14:48:38.591179 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_645c2200-d127-4ffe-a91e-9f9ae104dc06/rabbitmq/0.log" Dec 01 14:48:38 crc kubenswrapper[4585]: I1201 14:48:38.737468 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b271e13c-b935-4f31-a32d-865af7228e55/setup-container/0.log" Dec 01 14:48:38 crc kubenswrapper[4585]: I1201 14:48:38.877271 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz_27a8adc5-7598-4bf7-b46f-9a853afce3e6/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:38 crc kubenswrapper[4585]: I1201 14:48:38.880501 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b271e13c-b935-4f31-a32d-865af7228e55/rabbitmq/0.log" Dec 01 14:48:39 crc kubenswrapper[4585]: I1201 14:48:39.095095 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-47qxg_883ed263-3b11-459f-83d8-c29a49f9c79c/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:39 crc kubenswrapper[4585]: I1201 14:48:39.221660 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-p647g_c334f141-1564-4112-a013-53207cf5900c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:39 crc kubenswrapper[4585]: I1201 14:48:39.392313 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-td45x_3d807047-8744-4a9e-9bf8-1f492a8034b5/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:39 crc kubenswrapper[4585]: I1201 14:48:39.497008 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-pbt6b_31a42d9c-35e4-437d-8f54-47a3cef27d7e/ssh-known-hosts-edpm-deployment/0.log" Dec 01 14:48:39 crc kubenswrapper[4585]: I1201 14:48:39.791952 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5dc687f89-lzwxh_8f12e73f-f03a-4a68-a3e5-d4373d8fc583/proxy-httpd/0.log" Dec 01 14:48:39 crc kubenswrapper[4585]: I1201 14:48:39.881931 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5dc687f89-lzwxh_8f12e73f-f03a-4a68-a3e5-d4373d8fc583/proxy-server/0.log" Dec 01 14:48:40 crc kubenswrapper[4585]: I1201 14:48:40.026460 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-7zn7j_26bcdef2-b1e8-4848-abc4-b1f6a45c9916/swift-ring-rebalance/0.log" Dec 01 14:48:40 crc kubenswrapper[4585]: I1201 14:48:40.083646 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/account-auditor/0.log" Dec 01 14:48:40 crc kubenswrapper[4585]: I1201 14:48:40.169447 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/account-reaper/0.log" Dec 01 14:48:40 crc kubenswrapper[4585]: I1201 14:48:40.319752 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/account-replicator/0.log" Dec 01 14:48:40 crc kubenswrapper[4585]: I1201 14:48:40.402772 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/account-server/0.log" Dec 01 14:48:40 crc kubenswrapper[4585]: I1201 14:48:40.461994 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/container-replicator/0.log" Dec 01 14:48:40 crc kubenswrapper[4585]: I1201 14:48:40.492376 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/container-auditor/0.log" Dec 01 14:48:40 crc kubenswrapper[4585]: I1201 14:48:40.688536 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/container-updater/0.log" Dec 01 14:48:40 crc kubenswrapper[4585]: I1201 14:48:40.713328 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/container-server/0.log" Dec 01 14:48:40 crc kubenswrapper[4585]: I1201 14:48:40.757258 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/object-auditor/0.log" Dec 01 14:48:40 crc kubenswrapper[4585]: I1201 14:48:40.773271 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/object-expirer/0.log" Dec 01 14:48:40 crc kubenswrapper[4585]: 
I1201 14:48:40.972954 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/object-replicator/0.log" Dec 01 14:48:41 crc kubenswrapper[4585]: I1201 14:48:41.052632 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/object-server/0.log" Dec 01 14:48:41 crc kubenswrapper[4585]: I1201 14:48:41.071340 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/rsync/0.log" Dec 01 14:48:41 crc kubenswrapper[4585]: I1201 14:48:41.102783 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/object-updater/0.log" Dec 01 14:48:41 crc kubenswrapper[4585]: I1201 14:48:41.261572 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/swift-recon-cron/0.log" Dec 01 14:48:41 crc kubenswrapper[4585]: I1201 14:48:41.498499 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb_1232f97e-9bf9-4917-b806-e5de8f180f70/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:41 crc kubenswrapper[4585]: I1201 14:48:41.568903 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_8c35f110-b7a3-4cbc-b181-1589a74f5d89/tempest-tests-tempest-tests-runner/0.log" Dec 01 14:48:41 crc kubenswrapper[4585]: I1201 14:48:41.714622 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e0b341a5-c1b2-40a7-b2c4-0128fe7a389f/test-operator-logs-container/0.log" Dec 01 14:48:41 crc kubenswrapper[4585]: I1201 14:48:41.876747 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs_a09e5590-d28a-4c20-80eb-ff1f448ec290/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:48:43 crc kubenswrapper[4585]: I1201 14:48:43.716190 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:48:43 crc kubenswrapper[4585]: I1201 14:48:43.716542 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:48:56 crc kubenswrapper[4585]: I1201 14:48:56.166794 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_bbcd4b11-d625-4425-82d8-7c32d8c24c5c/memcached/0.log" Dec 01 14:49:12 crc kubenswrapper[4585]: I1201 14:49:12.868879 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/util/0.log" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.142804 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/pull/0.log" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.144258 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/util/0.log" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.210353 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/pull/0.log" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.438393 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/util/0.log" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.473569 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/extract/0.log" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.474398 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/pull/0.log" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.645779 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-br8df_8da768c2-cb8c-40f9-b8d1-54a66743b340/kube-rbac-proxy/0.log" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.698090 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qnbzj_59994d2c-6485-4beb-bcfc-3fd4a22bd203/kube-rbac-proxy/0.log" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.715609 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.715653 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.723840 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-br8df_8da768c2-cb8c-40f9-b8d1-54a66743b340/manager/0.log" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.847894 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qnbzj_59994d2c-6485-4beb-bcfc-3fd4a22bd203/manager/0.log" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.958897 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-8qd82_c4697227-2800-4a64-89bf-5bf831077ceb/kube-rbac-proxy/0.log" Dec 01 14:49:13 crc kubenswrapper[4585]: I1201 14:49:13.979402 4585 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-8qd82_c4697227-2800-4a64-89bf-5bf831077ceb/manager/0.log" Dec 01 14:49:14 crc kubenswrapper[4585]: I1201 14:49:14.174846 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-4kmsq_7f8c91fb-441e-44f0-bf97-1340df47f4b0/kube-rbac-proxy/0.log" Dec 01 14:49:14 crc kubenswrapper[4585]: I1201 14:49:14.243243 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-4kmsq_7f8c91fb-441e-44f0-bf97-1340df47f4b0/manager/0.log" Dec 01 14:49:14 crc kubenswrapper[4585]: I1201 14:49:14.337244 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-qpqgr_60bbdebb-4ac8-4971-82b2-252a989a8c3a/kube-rbac-proxy/0.log" Dec 01 14:49:14 crc kubenswrapper[4585]: I1201 14:49:14.470015 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-qpqgr_60bbdebb-4ac8-4971-82b2-252a989a8c3a/manager/0.log" Dec 01 14:49:14 crc kubenswrapper[4585]: I1201 14:49:14.513772 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-vxz95_5afd79b9-5528-4ffe-9d3f-ac7b05502348/kube-rbac-proxy/0.log" Dec 01 14:49:14 crc kubenswrapper[4585]: I1201 14:49:14.569689 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-vxz95_5afd79b9-5528-4ffe-9d3f-ac7b05502348/manager/0.log" Dec 01 14:49:14 crc kubenswrapper[4585]: I1201 14:49:14.666436 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-77nsb_f496d7d1-7362-487d-88d7-33e2c26ce97b/kube-rbac-proxy/0.log" Dec 01 14:49:14 crc kubenswrapper[4585]: I1201 14:49:14.949409 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-77nsb_f496d7d1-7362-487d-88d7-33e2c26ce97b/manager/0.log" Dec 01 14:49:14 crc kubenswrapper[4585]: I1201 14:49:14.969755 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-sqs7f_1d516bcd-4ed7-4c83-a07e-3a8f66761090/kube-rbac-proxy/0.log" Dec 01 14:49:15 crc kubenswrapper[4585]: I1201 14:49:15.030235 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-sqs7f_1d516bcd-4ed7-4c83-a07e-3a8f66761090/manager/0.log" Dec 01 14:49:15 crc kubenswrapper[4585]: I1201 14:49:15.185057 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-xsbwl_50b75abe-8fa5-4e48-87bb-560b5609feda/kube-rbac-proxy/0.log" Dec 01 14:49:15 crc kubenswrapper[4585]: I1201 14:49:15.271768 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-xsbwl_50b75abe-8fa5-4e48-87bb-560b5609feda/manager/0.log" Dec 01 14:49:15 crc kubenswrapper[4585]: I1201 14:49:15.387433 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-pmpnl_abe5e9b4-4f45-4fb6-92f7-739d4174996b/kube-rbac-proxy/0.log" Dec 01 14:49:15 crc kubenswrapper[4585]: I1201 
14:49:15.419561 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-pmpnl_abe5e9b4-4f45-4fb6-92f7-739d4174996b/manager/0.log" Dec 01 14:49:15 crc kubenswrapper[4585]: I1201 14:49:15.561945 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ndffl_1c21caba-6277-4106-b637-a4874412f527/kube-rbac-proxy/0.log" Dec 01 14:49:15 crc kubenswrapper[4585]: I1201 14:49:15.669002 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ndffl_1c21caba-6277-4106-b637-a4874412f527/manager/0.log" Dec 01 14:49:15 crc kubenswrapper[4585]: I1201 14:49:15.750943 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-8fdcb_f62dd90c-aa85-4650-92e0-13e52ec60360/kube-rbac-proxy/0.log" Dec 01 14:49:15 crc kubenswrapper[4585]: I1201 14:49:15.854136 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-8fdcb_f62dd90c-aa85-4650-92e0-13e52ec60360/manager/0.log" Dec 01 14:49:15 crc kubenswrapper[4585]: I1201 14:49:15.967103 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-w4g57_baa99a85-be34-458d-bc16-c367d4635b10/kube-rbac-proxy/0.log" Dec 01 14:49:16 crc kubenswrapper[4585]: I1201 14:49:16.077491 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-w4g57_baa99a85-be34-458d-bc16-c367d4635b10/manager/0.log" Dec 01 14:49:16 crc kubenswrapper[4585]: I1201 14:49:16.221566 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-6hdc6_0854e7b6-a6fb-4fd0-9e48-564df5d8fea2/kube-rbac-proxy/0.log" Dec 01 14:49:16 crc kubenswrapper[4585]: I1201 14:49:16.234184 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-6hdc6_0854e7b6-a6fb-4fd0-9e48-564df5d8fea2/manager/0.log" Dec 01 14:49:16 crc kubenswrapper[4585]: I1201 14:49:16.387544 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh_bcc7d39e-d462-4eaa-89fa-625c72c956b6/kube-rbac-proxy/0.log" Dec 01 14:49:16 crc kubenswrapper[4585]: I1201 14:49:16.495939 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh_bcc7d39e-d462-4eaa-89fa-625c72c956b6/manager/0.log" Dec 01 14:49:16 crc kubenswrapper[4585]: I1201 14:49:16.986442 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-d645d669b-rhjvp_9c187dfe-402b-4e73-8f4f-3d9dcf360954/operator/0.log" Dec 01 14:49:17 crc kubenswrapper[4585]: I1201 14:49:17.091682 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-47c9p_12b19089-35d3-41e8-b50f-385c3d8bb27a/registry-server/0.log" Dec 01 14:49:17 crc kubenswrapper[4585]: I1201 14:49:17.331533 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-r4qhj_7b6381d5-3b01-4c14-a553-e4a51274b140/kube-rbac-proxy/0.log" Dec 01 14:49:17 crc 
kubenswrapper[4585]: I1201 14:49:17.500032 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-r4qhj_7b6381d5-3b01-4c14-a553-e4a51274b140/manager/0.log" Dec 01 14:49:17 crc kubenswrapper[4585]: I1201 14:49:17.555324 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-m6vwb_cdbd6707-63ae-429d-8111-48ab6f912699/kube-rbac-proxy/0.log" Dec 01 14:49:17 crc kubenswrapper[4585]: I1201 14:49:17.657852 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-m6vwb_cdbd6707-63ae-429d-8111-48ab6f912699/manager/0.log" Dec 01 14:49:17 crc kubenswrapper[4585]: I1201 14:49:17.751254 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-b9b8558c-w5sxw_2177fed7-edae-4e55-94fd-2037166cbfdc/manager/0.log" Dec 01 14:49:17 crc kubenswrapper[4585]: I1201 14:49:17.771461 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-dcz6q_f4100ac0-da14-4d72-88e8-7f7356dad361/operator/0.log" Dec 01 14:49:17 crc kubenswrapper[4585]: I1201 14:49:17.901275 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5448bbd495-75vsz_b847594a-d018-4939-8177-3faf4a42da5a/manager/0.log" Dec 01 14:49:17 crc kubenswrapper[4585]: I1201 14:49:17.912458 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5448bbd495-75vsz_b847594a-d018-4939-8177-3faf4a42da5a/kube-rbac-proxy/0.log" Dec 01 14:49:18 crc kubenswrapper[4585]: I1201 14:49:18.003443 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-tgt7n_a672a71f-0885-4771-811e-fd658d282a84/kube-rbac-proxy/0.log" Dec 01 14:49:18 crc kubenswrapper[4585]: I1201 14:49:18.119090 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-tgt7n_a672a71f-0885-4771-811e-fd658d282a84/manager/0.log" Dec 01 14:49:18 crc kubenswrapper[4585]: I1201 14:49:18.161886 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-798dw_c756d201-c2d0-45f1-af3a-acdff1926a1a/kube-rbac-proxy/0.log" Dec 01 14:49:18 crc kubenswrapper[4585]: I1201 14:49:18.224110 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-798dw_c756d201-c2d0-45f1-af3a-acdff1926a1a/manager/0.log" Dec 01 14:49:18 crc kubenswrapper[4585]: I1201 14:49:18.328773 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-cqpws_59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4/kube-rbac-proxy/0.log" Dec 01 14:49:18 crc kubenswrapper[4585]: I1201 14:49:18.361781 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-cqpws_59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4/manager/0.log" Dec 01 14:49:38 crc kubenswrapper[4585]: I1201 14:49:38.660742 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rn9hl_dc83aa4a-2686-47c8-876b-c6cf2192b493/control-plane-machine-set-operator/0.log" Dec 01 14:49:38 crc kubenswrapper[4585]: I1201 14:49:38.896192 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-42pj4_795dab1c-49d5-4b05-a84f-4e1655d459fc/kube-rbac-proxy/0.log" Dec 01 14:49:38 crc kubenswrapper[4585]: I1201 14:49:38.902144 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-42pj4_795dab1c-49d5-4b05-a84f-4e1655d459fc/machine-api-operator/0.log" Dec 01 14:49:43 crc kubenswrapper[4585]: I1201 14:49:43.716627 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:49:43 crc kubenswrapper[4585]: I1201 14:49:43.717210 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:49:43 crc kubenswrapper[4585]: I1201 14:49:43.717522 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:49:43 crc kubenswrapper[4585]: I1201 14:49:43.718301 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:49:43 crc kubenswrapper[4585]: I1201 14:49:43.718374 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" gracePeriod=600 Dec 01 14:49:43 crc kubenswrapper[4585]: E1201 14:49:43.837028 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:49:44 crc kubenswrapper[4585]: I1201 14:49:44.792227 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" exitCode=0 Dec 01 14:49:44 crc kubenswrapper[4585]: I1201 14:49:44.792386 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6"} Dec 01 14:49:44 
crc kubenswrapper[4585]: I1201 14:49:44.793027 4585 scope.go:117] "RemoveContainer" containerID="6ded03fb550f95ffdee9f445ca211a7031c49c1e9aa5ae8c0ec5434bb5ff5043" Dec 01 14:49:44 crc kubenswrapper[4585]: I1201 14:49:44.794122 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:49:44 crc kubenswrapper[4585]: E1201 14:49:44.794765 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:49:52 crc kubenswrapper[4585]: I1201 14:49:52.326548 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-mk428_45ed0e5e-d1d0-45c5-9710-bcc051a7956e/cert-manager-controller/0.log" Dec 01 14:49:52 crc kubenswrapper[4585]: I1201 14:49:52.457075 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-kfgj7_03ae09b7-07fe-4a7b-9012-c17019e6d0fa/cert-manager-cainjector/0.log" Dec 01 14:49:52 crc kubenswrapper[4585]: I1201 14:49:52.593492 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-2ldxr_f1560ff4-292f-425d-8b6f-d481c951c541/cert-manager-webhook/0.log" Dec 01 14:49:56 crc kubenswrapper[4585]: I1201 14:49:56.420560 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:49:56 crc kubenswrapper[4585]: E1201 14:49:56.421761 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:50:06 crc kubenswrapper[4585]: I1201 14:50:06.008315 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-kmtzd_5de4a007-93f4-45e6-a70a-5a036ff4377c/nmstate-console-plugin/0.log" Dec 01 14:50:06 crc kubenswrapper[4585]: I1201 14:50:06.214360 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jj8n7_6c8cbbf5-3146-44bc-8533-17523cd27750/nmstate-handler/0.log" Dec 01 14:50:06 crc kubenswrapper[4585]: I1201 14:50:06.289408 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-w84ck_d03ec6db-a14f-40ee-80b7-2232ffc0a321/kube-rbac-proxy/0.log" Dec 01 14:50:06 crc kubenswrapper[4585]: I1201 14:50:06.383779 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-w84ck_d03ec6db-a14f-40ee-80b7-2232ffc0a321/nmstate-metrics/0.log" Dec 01 14:50:06 crc kubenswrapper[4585]: I1201 14:50:06.551907 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-c4cpl_dfe8ef28-9d20-49ce-8084-bfdfbc024e0c/nmstate-operator/0.log" Dec 01 14:50:06 crc kubenswrapper[4585]: I1201 14:50:06.640167 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-rvcwb_a4a15dc7-9cbc-4c39-b9ec-f73877001cd7/nmstate-webhook/0.log" Dec 01 14:50:10 crc kubenswrapper[4585]: I1201 14:50:10.413028 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:50:10 crc kubenswrapper[4585]: E1201 14:50:10.413860 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:50:22 crc kubenswrapper[4585]: I1201 14:50:22.593472 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-hx8mm_3028ccae-b87c-4752-9558-1399dc8fa279/kube-rbac-proxy/0.log" Dec 01 14:50:22 crc kubenswrapper[4585]: I1201 14:50:22.732562 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-hx8mm_3028ccae-b87c-4752-9558-1399dc8fa279/controller/0.log" Dec 01 14:50:22 crc kubenswrapper[4585]: I1201 14:50:22.908492 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-frr-files/0.log" Dec 01 14:50:23 crc kubenswrapper[4585]: I1201 14:50:23.190569 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-frr-files/0.log" Dec 01 14:50:23 crc kubenswrapper[4585]: I1201 14:50:23.255791 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-reloader/0.log" Dec 01 14:50:23 crc kubenswrapper[4585]: I1201 14:50:23.311555 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-metrics/0.log" Dec 01 14:50:23 crc kubenswrapper[4585]: I1201 14:50:23.337638 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-reloader/0.log" Dec 01 14:50:23 crc kubenswrapper[4585]: I1201 14:50:23.497046 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-frr-files/0.log" Dec 01 14:50:23 crc kubenswrapper[4585]: I1201 14:50:23.502860 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-reloader/0.log" Dec 01 14:50:23 crc kubenswrapper[4585]: I1201 14:50:23.557351 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-metrics/0.log" Dec 01 14:50:23 crc kubenswrapper[4585]: I1201 14:50:23.639998 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-metrics/0.log" Dec 01 14:50:23 crc kubenswrapper[4585]: I1201 14:50:23.812953 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-frr-files/0.log" Dec 01 14:50:23 crc kubenswrapper[4585]: I1201 14:50:23.831759 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-metrics/0.log" Dec 01 14:50:23 crc kubenswrapper[4585]: I1201 14:50:23.874462 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-reloader/0.log" Dec 01 14:50:23 crc kubenswrapper[4585]: I1201 14:50:23.888482 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/controller/0.log" Dec 01 14:50:24 crc kubenswrapper[4585]: I1201 14:50:24.010651 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/frr-metrics/0.log" Dec 01 14:50:24 crc kubenswrapper[4585]: I1201 14:50:24.135018 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/kube-rbac-proxy-frr/0.log" Dec 01 14:50:24 crc kubenswrapper[4585]: I1201 14:50:24.150426 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/kube-rbac-proxy/0.log" Dec 01 14:50:24 crc kubenswrapper[4585]: I1201 14:50:24.352697 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/reloader/0.log" Dec 01 14:50:24 crc kubenswrapper[4585]: I1201 14:50:24.412304 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:50:24 crc kubenswrapper[4585]: E1201 14:50:24.412547 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:50:24 crc kubenswrapper[4585]: I1201 14:50:24.521914 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-lvvzw_7ee13572-ff22-43ea-8570-cc0f3a64d44e/frr-k8s-webhook-server/0.log" Dec 01 14:50:24 crc kubenswrapper[4585]: I1201 14:50:24.766497 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-65958ffb48-2555t_661d1aa1-ad66-45b2-8562-69776e5fb5af/manager/0.log" Dec 01 14:50:25 crc kubenswrapper[4585]: I1201 14:50:25.009890 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-64878d448f-cvc5q_99217243-f8e1-4533-925c-a3fac9b81346/webhook-server/0.log" Dec 01 14:50:25 crc kubenswrapper[4585]: I1201 14:50:25.036025 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/frr/0.log" Dec 01 14:50:25 crc kubenswrapper[4585]: I1201 14:50:25.092519 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tnnzj_24c44b85-a153-4622-864f-a0f690044361/kube-rbac-proxy/0.log" Dec 01 14:50:25 crc kubenswrapper[4585]: I1201 14:50:25.443407 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tnnzj_24c44b85-a153-4622-864f-a0f690044361/speaker/0.log" Dec 01 14:50:38 crc kubenswrapper[4585]: I1201 14:50:38.415312 4585 scope.go:117] "RemoveContainer" 
containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:50:38 crc kubenswrapper[4585]: E1201 14:50:38.416114 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:50:38 crc kubenswrapper[4585]: I1201 14:50:38.660903 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/util/0.log" Dec 01 14:50:38 crc kubenswrapper[4585]: I1201 14:50:38.978146 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/util/0.log" Dec 01 14:50:38 crc kubenswrapper[4585]: I1201 14:50:38.984866 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/pull/0.log" Dec 01 14:50:39 crc kubenswrapper[4585]: I1201 14:50:39.001211 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/pull/0.log" Dec 01 14:50:39 crc kubenswrapper[4585]: I1201 14:50:39.169782 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/util/0.log" Dec 01 14:50:39 crc kubenswrapper[4585]: I1201 14:50:39.211638 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/extract/0.log" Dec 01 14:50:39 crc kubenswrapper[4585]: I1201 14:50:39.300994 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/pull/0.log" Dec 01 14:50:39 crc kubenswrapper[4585]: I1201 14:50:39.402197 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/util/0.log" Dec 01 14:50:39 crc kubenswrapper[4585]: I1201 14:50:39.596850 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/pull/0.log" Dec 01 14:50:39 crc kubenswrapper[4585]: I1201 14:50:39.636184 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/util/0.log" Dec 01 14:50:39 crc kubenswrapper[4585]: I1201 14:50:39.643806 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/pull/0.log" Dec 01 14:50:40 crc kubenswrapper[4585]: I1201 14:50:40.096326 4585 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/util/0.log" Dec 01 14:50:40 crc kubenswrapper[4585]: I1201 14:50:40.154903 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/pull/0.log" Dec 01 14:50:40 crc kubenswrapper[4585]: I1201 14:50:40.193905 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/extract/0.log" Dec 01 14:50:40 crc kubenswrapper[4585]: I1201 14:50:40.377039 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/extract-utilities/0.log" Dec 01 14:50:40 crc kubenswrapper[4585]: I1201 14:50:40.584716 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/extract-content/0.log" Dec 01 14:50:40 crc kubenswrapper[4585]: I1201 14:50:40.588729 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/extract-content/0.log" Dec 01 14:50:40 crc kubenswrapper[4585]: I1201 14:50:40.648215 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/extract-utilities/0.log" Dec 01 14:50:40 crc kubenswrapper[4585]: I1201 14:50:40.834924 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/extract-content/0.log" Dec 01 14:50:40 crc kubenswrapper[4585]: I1201 14:50:40.852087 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/extract-utilities/0.log" Dec 01 14:50:41 crc kubenswrapper[4585]: I1201 14:50:41.152140 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/registry-server/0.log" Dec 01 14:50:41 crc kubenswrapper[4585]: I1201 14:50:41.158617 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/extract-utilities/0.log" Dec 01 14:50:41 crc kubenswrapper[4585]: I1201 14:50:41.423454 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/extract-content/0.log" Dec 01 14:50:41 crc kubenswrapper[4585]: I1201 14:50:41.423547 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/extract-utilities/0.log" Dec 01 14:50:41 crc kubenswrapper[4585]: I1201 14:50:41.482135 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/extract-content/0.log" Dec 01 14:50:41 crc kubenswrapper[4585]: I1201 14:50:41.607436 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/extract-utilities/0.log" Dec 01 14:50:41 crc 
kubenswrapper[4585]: I1201 14:50:41.632514 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/extract-content/0.log" Dec 01 14:50:41 crc kubenswrapper[4585]: I1201 14:50:41.858179 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lwz9j_73887a19-b0ad-43de-a7d3-bda4a7a2a06a/marketplace-operator/0.log" Dec 01 14:50:41 crc kubenswrapper[4585]: I1201 14:50:41.933505 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/registry-server/0.log" Dec 01 14:50:42 crc kubenswrapper[4585]: I1201 14:50:42.038145 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/extract-utilities/0.log" Dec 01 14:50:42 crc kubenswrapper[4585]: I1201 14:50:42.245270 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/extract-content/0.log" Dec 01 14:50:42 crc kubenswrapper[4585]: I1201 14:50:42.269316 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/extract-utilities/0.log" Dec 01 14:50:42 crc kubenswrapper[4585]: I1201 14:50:42.294874 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/extract-content/0.log" Dec 01 14:50:42 crc kubenswrapper[4585]: I1201 14:50:42.483150 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/extract-utilities/0.log" Dec 01 14:50:42 crc kubenswrapper[4585]: I1201 14:50:42.521876 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/extract-content/0.log" Dec 01 14:50:42 crc kubenswrapper[4585]: I1201 14:50:42.659883 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/registry-server/0.log" Dec 01 14:50:42 crc kubenswrapper[4585]: I1201 14:50:42.757006 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/extract-utilities/0.log" Dec 01 14:50:42 crc kubenswrapper[4585]: I1201 14:50:42.913425 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/extract-content/0.log" Dec 01 14:50:42 crc kubenswrapper[4585]: I1201 14:50:42.917740 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/extract-content/0.log" Dec 01 14:50:42 crc kubenswrapper[4585]: I1201 14:50:42.965124 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/extract-utilities/0.log" Dec 01 14:50:43 crc kubenswrapper[4585]: I1201 14:50:43.162289 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/extract-utilities/0.log" Dec 01 14:50:43 crc kubenswrapper[4585]: 
I1201 14:50:43.167019 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/extract-content/0.log" Dec 01 14:50:43 crc kubenswrapper[4585]: I1201 14:50:43.515794 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/registry-server/0.log" Dec 01 14:50:51 crc kubenswrapper[4585]: I1201 14:50:51.413247 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:50:51 crc kubenswrapper[4585]: E1201 14:50:51.414249 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:51:05 crc kubenswrapper[4585]: I1201 14:51:05.412621 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:51:05 crc kubenswrapper[4585]: E1201 14:51:05.413380 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:51:17 crc kubenswrapper[4585]: I1201 14:51:17.412932 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:51:17 crc kubenswrapper[4585]: E1201 14:51:17.414054 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:51:30 crc kubenswrapper[4585]: I1201 14:51:30.413619 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:51:30 crc kubenswrapper[4585]: E1201 14:51:30.414742 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:51:43 crc kubenswrapper[4585]: I1201 14:51:43.412484 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:51:43 crc kubenswrapper[4585]: E1201 14:51:43.414518 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:51:55 crc kubenswrapper[4585]: I1201 14:51:55.412346 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:51:55 crc kubenswrapper[4585]: E1201 14:51:55.414016 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:52:06 crc kubenswrapper[4585]: I1201 14:52:06.418238 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:52:06 crc kubenswrapper[4585]: E1201 14:52:06.418930 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:52:21 crc kubenswrapper[4585]: I1201 14:52:21.413032 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:52:21 crc kubenswrapper[4585]: E1201 14:52:21.413831 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:52:33 crc kubenswrapper[4585]: I1201 14:52:33.412526 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:52:33 crc kubenswrapper[4585]: E1201 14:52:33.413192 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:52:34 crc kubenswrapper[4585]: I1201 14:52:34.345263 4585 generic.go:334] "Generic (PLEG): container finished" podID="0bdd4164-b891-436c-805e-1cf3a07fb6c4" containerID="64d5b79dcc2c3ee0ad987e672c1c09a119869ebbb2d3a8ed66b789bc68960a2f" exitCode=0 Dec 01 14:52:34 crc kubenswrapper[4585]: I1201 14:52:34.345408 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4nbp/must-gather-hczfk" event={"ID":"0bdd4164-b891-436c-805e-1cf3a07fb6c4","Type":"ContainerDied","Data":"64d5b79dcc2c3ee0ad987e672c1c09a119869ebbb2d3a8ed66b789bc68960a2f"} Dec 01 14:52:34 crc 
kubenswrapper[4585]: I1201 14:52:34.346742 4585 scope.go:117] "RemoveContainer" containerID="64d5b79dcc2c3ee0ad987e672c1c09a119869ebbb2d3a8ed66b789bc68960a2f" Dec 01 14:52:34 crc kubenswrapper[4585]: I1201 14:52:34.437472 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p4nbp_must-gather-hczfk_0bdd4164-b891-436c-805e-1cf3a07fb6c4/gather/0.log" Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.051374 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p4nbp/must-gather-hczfk"] Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.053445 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-p4nbp/must-gather-hczfk" podUID="0bdd4164-b891-436c-805e-1cf3a07fb6c4" containerName="copy" containerID="cri-o://6d572da308856b6b00d934ab3af9b8de610d0f3946bbed91aeefcda8da0ffe3a" gracePeriod=2 Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.060798 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p4nbp/must-gather-hczfk"] Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.434754 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p4nbp_must-gather-hczfk_0bdd4164-b891-436c-805e-1cf3a07fb6c4/copy/0.log" Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.438474 4585 generic.go:334] "Generic (PLEG): container finished" podID="0bdd4164-b891-436c-805e-1cf3a07fb6c4" containerID="6d572da308856b6b00d934ab3af9b8de610d0f3946bbed91aeefcda8da0ffe3a" exitCode=143 Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.438524 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8599999833addf67546a50923af2aa520bd50648352b9e6832402a4003ed08f6" Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.485498 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p4nbp_must-gather-hczfk_0bdd4164-b891-436c-805e-1cf3a07fb6c4/copy/0.log" Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.485929 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4nbp/must-gather-hczfk" Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.610560 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82h8s\" (UniqueName: \"kubernetes.io/projected/0bdd4164-b891-436c-805e-1cf3a07fb6c4-kube-api-access-82h8s\") pod \"0bdd4164-b891-436c-805e-1cf3a07fb6c4\" (UID: \"0bdd4164-b891-436c-805e-1cf3a07fb6c4\") " Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.611014 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0bdd4164-b891-436c-805e-1cf3a07fb6c4-must-gather-output\") pod \"0bdd4164-b891-436c-805e-1cf3a07fb6c4\" (UID: \"0bdd4164-b891-436c-805e-1cf3a07fb6c4\") " Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.618173 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bdd4164-b891-436c-805e-1cf3a07fb6c4-kube-api-access-82h8s" (OuterVolumeSpecName: "kube-api-access-82h8s") pod "0bdd4164-b891-436c-805e-1cf3a07fb6c4" (UID: "0bdd4164-b891-436c-805e-1cf3a07fb6c4"). InnerVolumeSpecName "kube-api-access-82h8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.714316 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82h8s\" (UniqueName: \"kubernetes.io/projected/0bdd4164-b891-436c-805e-1cf3a07fb6c4-kube-api-access-82h8s\") on node \"crc\" DevicePath \"\"" Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.751522 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bdd4164-b891-436c-805e-1cf3a07fb6c4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0bdd4164-b891-436c-805e-1cf3a07fb6c4" (UID: "0bdd4164-b891-436c-805e-1cf3a07fb6c4"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:52:42 crc kubenswrapper[4585]: I1201 14:52:42.816711 4585 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0bdd4164-b891-436c-805e-1cf3a07fb6c4-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 14:52:43 crc kubenswrapper[4585]: I1201 14:52:43.446791 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4nbp/must-gather-hczfk" Dec 01 14:52:44 crc kubenswrapper[4585]: I1201 14:52:44.425114 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bdd4164-b891-436c-805e-1cf3a07fb6c4" path="/var/lib/kubelet/pods/0bdd4164-b891-436c-805e-1cf3a07fb6c4/volumes" Dec 01 14:52:46 crc kubenswrapper[4585]: I1201 14:52:46.419341 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:52:46 crc kubenswrapper[4585]: E1201 14:52:46.419961 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:52:57 crc kubenswrapper[4585]: I1201 14:52:57.413836 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:52:57 crc kubenswrapper[4585]: E1201 14:52:57.414500 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.727724 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-685dj"] Dec 01 14:53:06 crc kubenswrapper[4585]: E1201 14:53:06.728863 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bdd4164-b891-436c-805e-1cf3a07fb6c4" containerName="gather" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.728881 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bdd4164-b891-436c-805e-1cf3a07fb6c4" containerName="gather" Dec 01 14:53:06 crc kubenswrapper[4585]: E1201 14:53:06.728895 4585 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0bdd4164-b891-436c-805e-1cf3a07fb6c4" containerName="copy" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.728902 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bdd4164-b891-436c-805e-1cf3a07fb6c4" containerName="copy" Dec 01 14:53:06 crc kubenswrapper[4585]: E1201 14:53:06.728926 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d09eaea-79f0-48ee-93a4-8260a8006f5d" containerName="container-00" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.728933 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d09eaea-79f0-48ee-93a4-8260a8006f5d" containerName="container-00" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.729188 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bdd4164-b891-436c-805e-1cf3a07fb6c4" containerName="copy" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.729205 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d09eaea-79f0-48ee-93a4-8260a8006f5d" containerName="container-00" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.729219 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bdd4164-b891-436c-805e-1cf3a07fb6c4" containerName="gather" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.730530 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.748132 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-685dj"] Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.763523 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f87320-7c60-4215-9f78-8fca5b6c7900-catalog-content\") pod \"certified-operators-685dj\" (UID: \"d0f87320-7c60-4215-9f78-8fca5b6c7900\") " pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.763650 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzt4c\" (UniqueName: \"kubernetes.io/projected/d0f87320-7c60-4215-9f78-8fca5b6c7900-kube-api-access-dzt4c\") pod \"certified-operators-685dj\" (UID: \"d0f87320-7c60-4215-9f78-8fca5b6c7900\") " pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.763682 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f87320-7c60-4215-9f78-8fca5b6c7900-utilities\") pod \"certified-operators-685dj\" (UID: \"d0f87320-7c60-4215-9f78-8fca5b6c7900\") " pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.865830 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzt4c\" (UniqueName: \"kubernetes.io/projected/d0f87320-7c60-4215-9f78-8fca5b6c7900-kube-api-access-dzt4c\") pod \"certified-operators-685dj\" (UID: \"d0f87320-7c60-4215-9f78-8fca5b6c7900\") " pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.865915 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f87320-7c60-4215-9f78-8fca5b6c7900-utilities\") pod \"certified-operators-685dj\" (UID: 
\"d0f87320-7c60-4215-9f78-8fca5b6c7900\") " pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.866133 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f87320-7c60-4215-9f78-8fca5b6c7900-catalog-content\") pod \"certified-operators-685dj\" (UID: \"d0f87320-7c60-4215-9f78-8fca5b6c7900\") " pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.866746 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f87320-7c60-4215-9f78-8fca5b6c7900-utilities\") pod \"certified-operators-685dj\" (UID: \"d0f87320-7c60-4215-9f78-8fca5b6c7900\") " pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.866858 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f87320-7c60-4215-9f78-8fca5b6c7900-catalog-content\") pod \"certified-operators-685dj\" (UID: \"d0f87320-7c60-4215-9f78-8fca5b6c7900\") " pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:06 crc kubenswrapper[4585]: I1201 14:53:06.888484 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzt4c\" (UniqueName: \"kubernetes.io/projected/d0f87320-7c60-4215-9f78-8fca5b6c7900-kube-api-access-dzt4c\") pod \"certified-operators-685dj\" (UID: \"d0f87320-7c60-4215-9f78-8fca5b6c7900\") " pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:07 crc kubenswrapper[4585]: I1201 14:53:07.075420 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:07 crc kubenswrapper[4585]: I1201 14:53:07.518349 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-685dj"] Dec 01 14:53:07 crc kubenswrapper[4585]: I1201 14:53:07.681836 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-685dj" event={"ID":"d0f87320-7c60-4215-9f78-8fca5b6c7900","Type":"ContainerStarted","Data":"9f4516b71442aa1b380a34e42a9b0eab10951209ec508cb3581df0523e38442a"} Dec 01 14:53:08 crc kubenswrapper[4585]: I1201 14:53:08.694411 4585 generic.go:334] "Generic (PLEG): container finished" podID="d0f87320-7c60-4215-9f78-8fca5b6c7900" containerID="45bf341041cb2c641f913e553e2a9bbb36401b527ce1605ae18d53cbc99e26dc" exitCode=0 Dec 01 14:53:08 crc kubenswrapper[4585]: I1201 14:53:08.694526 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-685dj" event={"ID":"d0f87320-7c60-4215-9f78-8fca5b6c7900","Type":"ContainerDied","Data":"45bf341041cb2c641f913e553e2a9bbb36401b527ce1605ae18d53cbc99e26dc"} Dec 01 14:53:08 crc kubenswrapper[4585]: I1201 14:53:08.697810 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 14:53:09 crc kubenswrapper[4585]: I1201 14:53:09.705839 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-685dj" event={"ID":"d0f87320-7c60-4215-9f78-8fca5b6c7900","Type":"ContainerStarted","Data":"fb250bc00636ed71bc27ad8d4a8a29a040cc21e5dc04719a41d8e36698743fcd"} Dec 01 14:53:10 crc kubenswrapper[4585]: I1201 14:53:10.746291 4585 generic.go:334] "Generic (PLEG): container finished" 
podID="d0f87320-7c60-4215-9f78-8fca5b6c7900" containerID="fb250bc00636ed71bc27ad8d4a8a29a040cc21e5dc04719a41d8e36698743fcd" exitCode=0 Dec 01 14:53:10 crc kubenswrapper[4585]: I1201 14:53:10.746627 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-685dj" event={"ID":"d0f87320-7c60-4215-9f78-8fca5b6c7900","Type":"ContainerDied","Data":"fb250bc00636ed71bc27ad8d4a8a29a040cc21e5dc04719a41d8e36698743fcd"} Dec 01 14:53:11 crc kubenswrapper[4585]: I1201 14:53:11.412546 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:53:11 crc kubenswrapper[4585]: E1201 14:53:11.413379 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:53:11 crc kubenswrapper[4585]: I1201 14:53:11.755345 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-685dj" event={"ID":"d0f87320-7c60-4215-9f78-8fca5b6c7900","Type":"ContainerStarted","Data":"de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8"} Dec 01 14:53:11 crc kubenswrapper[4585]: I1201 14:53:11.780193 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-685dj" podStartSLOduration=3.215320411 podStartE2EDuration="5.780172903s" podCreationTimestamp="2025-12-01 14:53:06 +0000 UTC" firstStartedPulling="2025-12-01 14:53:08.697519225 +0000 UTC m=+3302.681733080" lastFinishedPulling="2025-12-01 14:53:11.262371717 +0000 UTC m=+3305.246585572" observedRunningTime="2025-12-01 14:53:11.770482628 +0000 UTC m=+3305.754696503" watchObservedRunningTime="2025-12-01 14:53:11.780172903 +0000 UTC m=+3305.764386758" Dec 01 14:53:17 crc kubenswrapper[4585]: I1201 14:53:17.075805 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:17 crc kubenswrapper[4585]: I1201 14:53:17.076624 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:17 crc kubenswrapper[4585]: I1201 14:53:17.157080 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:17 crc kubenswrapper[4585]: I1201 14:53:17.873804 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:17 crc kubenswrapper[4585]: I1201 14:53:17.928046 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-685dj"] Dec 01 14:53:19 crc kubenswrapper[4585]: I1201 14:53:19.810962 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vvd5d"] Dec 01 14:53:19 crc kubenswrapper[4585]: I1201 14:53:19.817812 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:19 crc kubenswrapper[4585]: I1201 14:53:19.830364 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvd5d"] Dec 01 14:53:19 crc kubenswrapper[4585]: I1201 14:53:19.846780 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-685dj" podUID="d0f87320-7c60-4215-9f78-8fca5b6c7900" containerName="registry-server" containerID="cri-o://de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8" gracePeriod=2 Dec 01 14:53:19 crc kubenswrapper[4585]: I1201 14:53:19.938883 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x99ql\" (UniqueName: \"kubernetes.io/projected/247829d1-6608-4e90-9a9f-7b769ce169dd-kube-api-access-x99ql\") pod \"redhat-marketplace-vvd5d\" (UID: \"247829d1-6608-4e90-9a9f-7b769ce169dd\") " pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:19 crc kubenswrapper[4585]: I1201 14:53:19.938930 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247829d1-6608-4e90-9a9f-7b769ce169dd-catalog-content\") pod \"redhat-marketplace-vvd5d\" (UID: \"247829d1-6608-4e90-9a9f-7b769ce169dd\") " pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:19 crc kubenswrapper[4585]: I1201 14:53:19.939021 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247829d1-6608-4e90-9a9f-7b769ce169dd-utilities\") pod \"redhat-marketplace-vvd5d\" (UID: \"247829d1-6608-4e90-9a9f-7b769ce169dd\") " pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.041532 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x99ql\" (UniqueName: \"kubernetes.io/projected/247829d1-6608-4e90-9a9f-7b769ce169dd-kube-api-access-x99ql\") pod \"redhat-marketplace-vvd5d\" (UID: \"247829d1-6608-4e90-9a9f-7b769ce169dd\") " pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.042037 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247829d1-6608-4e90-9a9f-7b769ce169dd-catalog-content\") pod \"redhat-marketplace-vvd5d\" (UID: \"247829d1-6608-4e90-9a9f-7b769ce169dd\") " pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.042266 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247829d1-6608-4e90-9a9f-7b769ce169dd-utilities\") pod \"redhat-marketplace-vvd5d\" (UID: \"247829d1-6608-4e90-9a9f-7b769ce169dd\") " pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.043154 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247829d1-6608-4e90-9a9f-7b769ce169dd-utilities\") pod \"redhat-marketplace-vvd5d\" (UID: \"247829d1-6608-4e90-9a9f-7b769ce169dd\") " pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.043154 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247829d1-6608-4e90-9a9f-7b769ce169dd-catalog-content\") pod \"redhat-marketplace-vvd5d\" (UID: \"247829d1-6608-4e90-9a9f-7b769ce169dd\") " pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.069942 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x99ql\" (UniqueName: \"kubernetes.io/projected/247829d1-6608-4e90-9a9f-7b769ce169dd-kube-api-access-x99ql\") pod \"redhat-marketplace-vvd5d\" (UID: \"247829d1-6608-4e90-9a9f-7b769ce169dd\") " pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.141798 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.453483 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.555024 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f87320-7c60-4215-9f78-8fca5b6c7900-catalog-content\") pod \"d0f87320-7c60-4215-9f78-8fca5b6c7900\" (UID: \"d0f87320-7c60-4215-9f78-8fca5b6c7900\") " Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.555429 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f87320-7c60-4215-9f78-8fca5b6c7900-utilities\") pod \"d0f87320-7c60-4215-9f78-8fca5b6c7900\" (UID: \"d0f87320-7c60-4215-9f78-8fca5b6c7900\") " Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.555589 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzt4c\" (UniqueName: \"kubernetes.io/projected/d0f87320-7c60-4215-9f78-8fca5b6c7900-kube-api-access-dzt4c\") pod \"d0f87320-7c60-4215-9f78-8fca5b6c7900\" (UID: \"d0f87320-7c60-4215-9f78-8fca5b6c7900\") " Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.557653 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f87320-7c60-4215-9f78-8fca5b6c7900-utilities" (OuterVolumeSpecName: "utilities") pod "d0f87320-7c60-4215-9f78-8fca5b6c7900" (UID: "d0f87320-7c60-4215-9f78-8fca5b6c7900"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.560897 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f87320-7c60-4215-9f78-8fca5b6c7900-kube-api-access-dzt4c" (OuterVolumeSpecName: "kube-api-access-dzt4c") pod "d0f87320-7c60-4215-9f78-8fca5b6c7900" (UID: "d0f87320-7c60-4215-9f78-8fca5b6c7900"). InnerVolumeSpecName "kube-api-access-dzt4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.613012 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f87320-7c60-4215-9f78-8fca5b6c7900-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0f87320-7c60-4215-9f78-8fca5b6c7900" (UID: "d0f87320-7c60-4215-9f78-8fca5b6c7900"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.658316 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzt4c\" (UniqueName: \"kubernetes.io/projected/d0f87320-7c60-4215-9f78-8fca5b6c7900-kube-api-access-dzt4c\") on node \"crc\" DevicePath \"\"" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.658353 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f87320-7c60-4215-9f78-8fca5b6c7900-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.658363 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f87320-7c60-4215-9f78-8fca5b6c7900-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.722089 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvd5d"] Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.855668 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvd5d" event={"ID":"247829d1-6608-4e90-9a9f-7b769ce169dd","Type":"ContainerStarted","Data":"8e7bed3950c5b1b2ed3e4ffc00c508295b267285ab994cd570f989ceb75561b6"} Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.859569 4585 generic.go:334] "Generic (PLEG): container finished" podID="d0f87320-7c60-4215-9f78-8fca5b6c7900" containerID="de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8" exitCode=0 Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.859617 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-685dj" event={"ID":"d0f87320-7c60-4215-9f78-8fca5b6c7900","Type":"ContainerDied","Data":"de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8"} Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.859640 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-685dj" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.859658 4585 scope.go:117] "RemoveContainer" containerID="de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.859647 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-685dj" event={"ID":"d0f87320-7c60-4215-9f78-8fca5b6c7900","Type":"ContainerDied","Data":"9f4516b71442aa1b380a34e42a9b0eab10951209ec508cb3581df0523e38442a"} Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.900662 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-685dj"] Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.909178 4585 scope.go:117] "RemoveContainer" containerID="fb250bc00636ed71bc27ad8d4a8a29a040cc21e5dc04719a41d8e36698743fcd" Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.913232 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-685dj"] Dec 01 14:53:20 crc kubenswrapper[4585]: I1201 14:53:20.942215 4585 scope.go:117] "RemoveContainer" containerID="45bf341041cb2c641f913e553e2a9bbb36401b527ce1605ae18d53cbc99e26dc" Dec 01 14:53:21 crc kubenswrapper[4585]: I1201 14:53:21.074779 4585 scope.go:117] "RemoveContainer" containerID="de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8" Dec 01 14:53:21 crc kubenswrapper[4585]: E1201 14:53:21.075673 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8\": container with ID starting with de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8 not found: ID does not exist" containerID="de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8" Dec 01 14:53:21 crc kubenswrapper[4585]: I1201 14:53:21.075730 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8"} err="failed to get container status \"de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8\": rpc error: code = NotFound desc = could not find container \"de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8\": container with ID starting with de1244bf7ff398715ff715fd0600d66eb075d1d3495a72c802c9320218d799c8 not found: ID does not exist" Dec 01 14:53:21 crc kubenswrapper[4585]: I1201 14:53:21.075788 4585 scope.go:117] "RemoveContainer" containerID="fb250bc00636ed71bc27ad8d4a8a29a040cc21e5dc04719a41d8e36698743fcd" Dec 01 14:53:21 crc kubenswrapper[4585]: E1201 14:53:21.076206 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb250bc00636ed71bc27ad8d4a8a29a040cc21e5dc04719a41d8e36698743fcd\": container with ID starting with fb250bc00636ed71bc27ad8d4a8a29a040cc21e5dc04719a41d8e36698743fcd not found: ID does not exist" containerID="fb250bc00636ed71bc27ad8d4a8a29a040cc21e5dc04719a41d8e36698743fcd" Dec 01 14:53:21 crc kubenswrapper[4585]: I1201 14:53:21.076259 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb250bc00636ed71bc27ad8d4a8a29a040cc21e5dc04719a41d8e36698743fcd"} err="failed to get container status \"fb250bc00636ed71bc27ad8d4a8a29a040cc21e5dc04719a41d8e36698743fcd\": rpc error: code = NotFound desc = could not find 
container \"fb250bc00636ed71bc27ad8d4a8a29a040cc21e5dc04719a41d8e36698743fcd\": container with ID starting with fb250bc00636ed71bc27ad8d4a8a29a040cc21e5dc04719a41d8e36698743fcd not found: ID does not exist" Dec 01 14:53:21 crc kubenswrapper[4585]: I1201 14:53:21.076296 4585 scope.go:117] "RemoveContainer" containerID="45bf341041cb2c641f913e553e2a9bbb36401b527ce1605ae18d53cbc99e26dc" Dec 01 14:53:21 crc kubenswrapper[4585]: E1201 14:53:21.076780 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45bf341041cb2c641f913e553e2a9bbb36401b527ce1605ae18d53cbc99e26dc\": container with ID starting with 45bf341041cb2c641f913e553e2a9bbb36401b527ce1605ae18d53cbc99e26dc not found: ID does not exist" containerID="45bf341041cb2c641f913e553e2a9bbb36401b527ce1605ae18d53cbc99e26dc" Dec 01 14:53:21 crc kubenswrapper[4585]: I1201 14:53:21.076813 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45bf341041cb2c641f913e553e2a9bbb36401b527ce1605ae18d53cbc99e26dc"} err="failed to get container status \"45bf341041cb2c641f913e553e2a9bbb36401b527ce1605ae18d53cbc99e26dc\": rpc error: code = NotFound desc = could not find container \"45bf341041cb2c641f913e553e2a9bbb36401b527ce1605ae18d53cbc99e26dc\": container with ID starting with 45bf341041cb2c641f913e553e2a9bbb36401b527ce1605ae18d53cbc99e26dc not found: ID does not exist" Dec 01 14:53:21 crc kubenswrapper[4585]: I1201 14:53:21.873474 4585 generic.go:334] "Generic (PLEG): container finished" podID="247829d1-6608-4e90-9a9f-7b769ce169dd" containerID="80f87303485335de89f0968a0d80cce7b3a3d5ad2f593f37e4dfb2accf82e8e7" exitCode=0 Dec 01 14:53:21 crc kubenswrapper[4585]: I1201 14:53:21.873591 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvd5d" event={"ID":"247829d1-6608-4e90-9a9f-7b769ce169dd","Type":"ContainerDied","Data":"80f87303485335de89f0968a0d80cce7b3a3d5ad2f593f37e4dfb2accf82e8e7"} Dec 01 14:53:22 crc kubenswrapper[4585]: I1201 14:53:22.426147 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f87320-7c60-4215-9f78-8fca5b6c7900" path="/var/lib/kubelet/pods/d0f87320-7c60-4215-9f78-8fca5b6c7900/volumes" Dec 01 14:53:23 crc kubenswrapper[4585]: I1201 14:53:23.895896 4585 generic.go:334] "Generic (PLEG): container finished" podID="247829d1-6608-4e90-9a9f-7b769ce169dd" containerID="892dd6e99314566cf0a60dcf8667fdf62c040b2bf5e096b67760ceacb0427078" exitCode=0 Dec 01 14:53:23 crc kubenswrapper[4585]: I1201 14:53:23.895995 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvd5d" event={"ID":"247829d1-6608-4e90-9a9f-7b769ce169dd","Type":"ContainerDied","Data":"892dd6e99314566cf0a60dcf8667fdf62c040b2bf5e096b67760ceacb0427078"} Dec 01 14:53:24 crc kubenswrapper[4585]: I1201 14:53:24.909064 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvd5d" event={"ID":"247829d1-6608-4e90-9a9f-7b769ce169dd","Type":"ContainerStarted","Data":"aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd"} Dec 01 14:53:24 crc kubenswrapper[4585]: I1201 14:53:24.933057 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vvd5d" podStartSLOduration=3.227345656 podStartE2EDuration="5.933029062s" podCreationTimestamp="2025-12-01 14:53:19 +0000 UTC" firstStartedPulling="2025-12-01 14:53:21.876234295 +0000 UTC 
m=+3315.860448160" lastFinishedPulling="2025-12-01 14:53:24.581917711 +0000 UTC m=+3318.566131566" observedRunningTime="2025-12-01 14:53:24.931396119 +0000 UTC m=+3318.915609984" watchObservedRunningTime="2025-12-01 14:53:24.933029062 +0000 UTC m=+3318.917242937" Dec 01 14:53:25 crc kubenswrapper[4585]: I1201 14:53:25.412893 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:53:25 crc kubenswrapper[4585]: E1201 14:53:25.413282 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:53:30 crc kubenswrapper[4585]: I1201 14:53:30.142815 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:30 crc kubenswrapper[4585]: I1201 14:53:30.143737 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:30 crc kubenswrapper[4585]: I1201 14:53:30.189263 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:31 crc kubenswrapper[4585]: I1201 14:53:31.030505 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:31 crc kubenswrapper[4585]: I1201 14:53:31.086320 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvd5d"] Dec 01 14:53:32 crc kubenswrapper[4585]: I1201 14:53:32.996023 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vvd5d" podUID="247829d1-6608-4e90-9a9f-7b769ce169dd" containerName="registry-server" containerID="cri-o://aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd" gracePeriod=2 Dec 01 14:53:33 crc kubenswrapper[4585]: I1201 14:53:33.437622 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:33 crc kubenswrapper[4585]: I1201 14:53:33.537139 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247829d1-6608-4e90-9a9f-7b769ce169dd-utilities\") pod \"247829d1-6608-4e90-9a9f-7b769ce169dd\" (UID: \"247829d1-6608-4e90-9a9f-7b769ce169dd\") " Dec 01 14:53:33 crc kubenswrapper[4585]: I1201 14:53:33.537268 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247829d1-6608-4e90-9a9f-7b769ce169dd-catalog-content\") pod \"247829d1-6608-4e90-9a9f-7b769ce169dd\" (UID: \"247829d1-6608-4e90-9a9f-7b769ce169dd\") " Dec 01 14:53:33 crc kubenswrapper[4585]: I1201 14:53:33.537393 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x99ql\" (UniqueName: \"kubernetes.io/projected/247829d1-6608-4e90-9a9f-7b769ce169dd-kube-api-access-x99ql\") pod \"247829d1-6608-4e90-9a9f-7b769ce169dd\" (UID: \"247829d1-6608-4e90-9a9f-7b769ce169dd\") " Dec 01 14:53:33 crc kubenswrapper[4585]: I1201 14:53:33.538845 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247829d1-6608-4e90-9a9f-7b769ce169dd-utilities" (OuterVolumeSpecName: "utilities") pod "247829d1-6608-4e90-9a9f-7b769ce169dd" (UID: "247829d1-6608-4e90-9a9f-7b769ce169dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:53:33 crc kubenswrapper[4585]: I1201 14:53:33.544183 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247829d1-6608-4e90-9a9f-7b769ce169dd-kube-api-access-x99ql" (OuterVolumeSpecName: "kube-api-access-x99ql") pod "247829d1-6608-4e90-9a9f-7b769ce169dd" (UID: "247829d1-6608-4e90-9a9f-7b769ce169dd"). InnerVolumeSpecName "kube-api-access-x99ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:53:33 crc kubenswrapper[4585]: I1201 14:53:33.560673 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247829d1-6608-4e90-9a9f-7b769ce169dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "247829d1-6608-4e90-9a9f-7b769ce169dd" (UID: "247829d1-6608-4e90-9a9f-7b769ce169dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:53:33 crc kubenswrapper[4585]: I1201 14:53:33.639906 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/247829d1-6608-4e90-9a9f-7b769ce169dd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:53:33 crc kubenswrapper[4585]: I1201 14:53:33.640178 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x99ql\" (UniqueName: \"kubernetes.io/projected/247829d1-6608-4e90-9a9f-7b769ce169dd-kube-api-access-x99ql\") on node \"crc\" DevicePath \"\"" Dec 01 14:53:33 crc kubenswrapper[4585]: I1201 14:53:33.640262 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/247829d1-6608-4e90-9a9f-7b769ce169dd-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.008718 4585 generic.go:334] "Generic (PLEG): container finished" podID="247829d1-6608-4e90-9a9f-7b769ce169dd" containerID="aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd" exitCode=0 Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.008868 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvd5d" event={"ID":"247829d1-6608-4e90-9a9f-7b769ce169dd","Type":"ContainerDied","Data":"aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd"} Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.009116 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vvd5d" event={"ID":"247829d1-6608-4e90-9a9f-7b769ce169dd","Type":"ContainerDied","Data":"8e7bed3950c5b1b2ed3e4ffc00c508295b267285ab994cd570f989ceb75561b6"} Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.009145 4585 scope.go:117] "RemoveContainer" containerID="aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd" Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.008925 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vvd5d" Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.056565 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvd5d"] Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.057756 4585 scope.go:117] "RemoveContainer" containerID="892dd6e99314566cf0a60dcf8667fdf62c040b2bf5e096b67760ceacb0427078" Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.072788 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vvd5d"] Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.079551 4585 scope.go:117] "RemoveContainer" containerID="80f87303485335de89f0968a0d80cce7b3a3d5ad2f593f37e4dfb2accf82e8e7" Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.123761 4585 scope.go:117] "RemoveContainer" containerID="aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd" Dec 01 14:53:34 crc kubenswrapper[4585]: E1201 14:53:34.124236 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd\": container with ID starting with aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd not found: ID does not exist" containerID="aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd" Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.124273 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd"} err="failed to get container status \"aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd\": rpc error: code = NotFound desc = could not find container \"aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd\": container with ID starting with aa2c7c07229906924ed21a2b22efc88c953ab306474999813a26c57f4e9951fd not found: ID does not exist" Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.124298 4585 scope.go:117] "RemoveContainer" containerID="892dd6e99314566cf0a60dcf8667fdf62c040b2bf5e096b67760ceacb0427078" Dec 01 14:53:34 crc kubenswrapper[4585]: E1201 14:53:34.124620 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"892dd6e99314566cf0a60dcf8667fdf62c040b2bf5e096b67760ceacb0427078\": container with ID starting with 892dd6e99314566cf0a60dcf8667fdf62c040b2bf5e096b67760ceacb0427078 not found: ID does not exist" containerID="892dd6e99314566cf0a60dcf8667fdf62c040b2bf5e096b67760ceacb0427078" Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.124648 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"892dd6e99314566cf0a60dcf8667fdf62c040b2bf5e096b67760ceacb0427078"} err="failed to get container status \"892dd6e99314566cf0a60dcf8667fdf62c040b2bf5e096b67760ceacb0427078\": rpc error: code = NotFound desc = could not find container \"892dd6e99314566cf0a60dcf8667fdf62c040b2bf5e096b67760ceacb0427078\": container with ID starting with 892dd6e99314566cf0a60dcf8667fdf62c040b2bf5e096b67760ceacb0427078 not found: ID does not exist" Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.124665 4585 scope.go:117] "RemoveContainer" containerID="80f87303485335de89f0968a0d80cce7b3a3d5ad2f593f37e4dfb2accf82e8e7" Dec 01 14:53:34 crc kubenswrapper[4585]: E1201 14:53:34.124899 4585 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"80f87303485335de89f0968a0d80cce7b3a3d5ad2f593f37e4dfb2accf82e8e7\": container with ID starting with 80f87303485335de89f0968a0d80cce7b3a3d5ad2f593f37e4dfb2accf82e8e7 not found: ID does not exist" containerID="80f87303485335de89f0968a0d80cce7b3a3d5ad2f593f37e4dfb2accf82e8e7" Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.124929 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f87303485335de89f0968a0d80cce7b3a3d5ad2f593f37e4dfb2accf82e8e7"} err="failed to get container status \"80f87303485335de89f0968a0d80cce7b3a3d5ad2f593f37e4dfb2accf82e8e7\": rpc error: code = NotFound desc = could not find container \"80f87303485335de89f0968a0d80cce7b3a3d5ad2f593f37e4dfb2accf82e8e7\": container with ID starting with 80f87303485335de89f0968a0d80cce7b3a3d5ad2f593f37e4dfb2accf82e8e7 not found: ID does not exist" Dec 01 14:53:34 crc kubenswrapper[4585]: I1201 14:53:34.424403 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247829d1-6608-4e90-9a9f-7b769ce169dd" path="/var/lib/kubelet/pods/247829d1-6608-4e90-9a9f-7b769ce169dd/volumes" Dec 01 14:53:35 crc kubenswrapper[4585]: I1201 14:53:35.436069 4585 scope.go:117] "RemoveContainer" containerID="6d572da308856b6b00d934ab3af9b8de610d0f3946bbed91aeefcda8da0ffe3a" Dec 01 14:53:35 crc kubenswrapper[4585]: I1201 14:53:35.460884 4585 scope.go:117] "RemoveContainer" containerID="baa05d52bdf2eb7b346d84c77f455a3c31addbc1d80676848b7348962187d9c2" Dec 01 14:53:35 crc kubenswrapper[4585]: I1201 14:53:35.492189 4585 scope.go:117] "RemoveContainer" containerID="64d5b79dcc2c3ee0ad987e672c1c09a119869ebbb2d3a8ed66b789bc68960a2f" Dec 01 14:53:38 crc kubenswrapper[4585]: I1201 14:53:38.412354 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:53:38 crc kubenswrapper[4585]: E1201 14:53:38.413455 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:53:52 crc kubenswrapper[4585]: I1201 14:53:52.413138 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:53:52 crc kubenswrapper[4585]: E1201 14:53:52.414103 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:54:07 crc kubenswrapper[4585]: I1201 14:54:07.413385 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:54:07 crc kubenswrapper[4585]: E1201 14:54:07.414294 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.413120 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:54:21 crc kubenswrapper[4585]: E1201 14:54:21.414137 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.526286 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5ngv2"] Dec 01 14:54:21 crc kubenswrapper[4585]: E1201 14:54:21.526757 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f87320-7c60-4215-9f78-8fca5b6c7900" containerName="extract-content" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.526782 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f87320-7c60-4215-9f78-8fca5b6c7900" containerName="extract-content" Dec 01 14:54:21 crc kubenswrapper[4585]: E1201 14:54:21.526815 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f87320-7c60-4215-9f78-8fca5b6c7900" containerName="extract-utilities" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.526825 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f87320-7c60-4215-9f78-8fca5b6c7900" containerName="extract-utilities" Dec 01 14:54:21 crc kubenswrapper[4585]: E1201 14:54:21.526845 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247829d1-6608-4e90-9a9f-7b769ce169dd" containerName="extract-content" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.526853 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="247829d1-6608-4e90-9a9f-7b769ce169dd" containerName="extract-content" Dec 01 14:54:21 crc kubenswrapper[4585]: E1201 14:54:21.526868 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247829d1-6608-4e90-9a9f-7b769ce169dd" containerName="registry-server" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.526875 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="247829d1-6608-4e90-9a9f-7b769ce169dd" containerName="registry-server" Dec 01 14:54:21 crc kubenswrapper[4585]: E1201 14:54:21.526887 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247829d1-6608-4e90-9a9f-7b769ce169dd" containerName="extract-utilities" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.526895 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="247829d1-6608-4e90-9a9f-7b769ce169dd" containerName="extract-utilities" Dec 01 14:54:21 crc kubenswrapper[4585]: E1201 14:54:21.526907 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f87320-7c60-4215-9f78-8fca5b6c7900" containerName="registry-server" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.526915 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f87320-7c60-4215-9f78-8fca5b6c7900" containerName="registry-server" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.527134 4585 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="247829d1-6608-4e90-9a9f-7b769ce169dd" containerName="registry-server" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.527150 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f87320-7c60-4215-9f78-8fca5b6c7900" containerName="registry-server" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.528810 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.550599 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ngv2"] Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.634917 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def91c25-0318-4b71-9022-8091782fe144-utilities\") pod \"redhat-operators-5ngv2\" (UID: \"def91c25-0318-4b71-9022-8091782fe144\") " pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.634998 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def91c25-0318-4b71-9022-8091782fe144-catalog-content\") pod \"redhat-operators-5ngv2\" (UID: \"def91c25-0318-4b71-9022-8091782fe144\") " pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.635122 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlmr7\" (UniqueName: \"kubernetes.io/projected/def91c25-0318-4b71-9022-8091782fe144-kube-api-access-qlmr7\") pod \"redhat-operators-5ngv2\" (UID: \"def91c25-0318-4b71-9022-8091782fe144\") " pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.736430 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def91c25-0318-4b71-9022-8091782fe144-catalog-content\") pod \"redhat-operators-5ngv2\" (UID: \"def91c25-0318-4b71-9022-8091782fe144\") " pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.736534 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlmr7\" (UniqueName: \"kubernetes.io/projected/def91c25-0318-4b71-9022-8091782fe144-kube-api-access-qlmr7\") pod \"redhat-operators-5ngv2\" (UID: \"def91c25-0318-4b71-9022-8091782fe144\") " pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.736686 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def91c25-0318-4b71-9022-8091782fe144-utilities\") pod \"redhat-operators-5ngv2\" (UID: \"def91c25-0318-4b71-9022-8091782fe144\") " pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.736820 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def91c25-0318-4b71-9022-8091782fe144-catalog-content\") pod \"redhat-operators-5ngv2\" (UID: \"def91c25-0318-4b71-9022-8091782fe144\") " pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.737069 4585 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def91c25-0318-4b71-9022-8091782fe144-utilities\") pod \"redhat-operators-5ngv2\" (UID: \"def91c25-0318-4b71-9022-8091782fe144\") " pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.758309 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlmr7\" (UniqueName: \"kubernetes.io/projected/def91c25-0318-4b71-9022-8091782fe144-kube-api-access-qlmr7\") pod \"redhat-operators-5ngv2\" (UID: \"def91c25-0318-4b71-9022-8091782fe144\") " pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:21 crc kubenswrapper[4585]: I1201 14:54:21.854894 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:22 crc kubenswrapper[4585]: I1201 14:54:22.352258 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ngv2"] Dec 01 14:54:22 crc kubenswrapper[4585]: I1201 14:54:22.490731 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ngv2" event={"ID":"def91c25-0318-4b71-9022-8091782fe144","Type":"ContainerStarted","Data":"7107d07a1abfe64e204beb6dcfddf8d987b76dd0a50a78db80eee8394c45c54e"} Dec 01 14:54:23 crc kubenswrapper[4585]: I1201 14:54:23.499365 4585 generic.go:334] "Generic (PLEG): container finished" podID="def91c25-0318-4b71-9022-8091782fe144" containerID="e4c1286611f354a708fd0193bf74b156edf8c44d925f468e4171edc75c83f1fb" exitCode=0 Dec 01 14:54:23 crc kubenswrapper[4585]: I1201 14:54:23.499397 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ngv2" event={"ID":"def91c25-0318-4b71-9022-8091782fe144","Type":"ContainerDied","Data":"e4c1286611f354a708fd0193bf74b156edf8c44d925f468e4171edc75c83f1fb"} Dec 01 14:54:25 crc kubenswrapper[4585]: I1201 14:54:25.518207 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ngv2" event={"ID":"def91c25-0318-4b71-9022-8091782fe144","Type":"ContainerStarted","Data":"d90ea6b559e913886084d753d30b6ac0427b031f209ca81ee750176cf7326aa8"} Dec 01 14:54:28 crc kubenswrapper[4585]: I1201 14:54:28.545328 4585 generic.go:334] "Generic (PLEG): container finished" podID="def91c25-0318-4b71-9022-8091782fe144" containerID="d90ea6b559e913886084d753d30b6ac0427b031f209ca81ee750176cf7326aa8" exitCode=0 Dec 01 14:54:28 crc kubenswrapper[4585]: I1201 14:54:28.545695 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ngv2" event={"ID":"def91c25-0318-4b71-9022-8091782fe144","Type":"ContainerDied","Data":"d90ea6b559e913886084d753d30b6ac0427b031f209ca81ee750176cf7326aa8"} Dec 01 14:54:29 crc kubenswrapper[4585]: I1201 14:54:29.560768 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ngv2" event={"ID":"def91c25-0318-4b71-9022-8091782fe144","Type":"ContainerStarted","Data":"383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8"} Dec 01 14:54:29 crc kubenswrapper[4585]: I1201 14:54:29.593423 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5ngv2" podStartSLOduration=2.829373957 podStartE2EDuration="8.59340031s" podCreationTimestamp="2025-12-01 14:54:21 +0000 UTC" firstStartedPulling="2025-12-01 14:54:23.501309375 +0000 UTC m=+3377.485523240" 
lastFinishedPulling="2025-12-01 14:54:29.265335738 +0000 UTC m=+3383.249549593" observedRunningTime="2025-12-01 14:54:29.579783591 +0000 UTC m=+3383.563997446" watchObservedRunningTime="2025-12-01 14:54:29.59340031 +0000 UTC m=+3383.577614165" Dec 01 14:54:31 crc kubenswrapper[4585]: I1201 14:54:31.855152 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:31 crc kubenswrapper[4585]: I1201 14:54:31.855519 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:32 crc kubenswrapper[4585]: I1201 14:54:32.901384 4585 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5ngv2" podUID="def91c25-0318-4b71-9022-8091782fe144" containerName="registry-server" probeResult="failure" output=< Dec 01 14:54:32 crc kubenswrapper[4585]: timeout: failed to connect service ":50051" within 1s Dec 01 14:54:32 crc kubenswrapper[4585]: > Dec 01 14:54:35 crc kubenswrapper[4585]: I1201 14:54:35.412057 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:54:35 crc kubenswrapper[4585]: E1201 14:54:35.412626 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 14:54:41 crc kubenswrapper[4585]: I1201 14:54:41.914568 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:41 crc kubenswrapper[4585]: I1201 14:54:41.977562 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:42 crc kubenswrapper[4585]: I1201 14:54:42.164197 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5ngv2"] Dec 01 14:54:43 crc kubenswrapper[4585]: I1201 14:54:43.673735 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5ngv2" podUID="def91c25-0318-4b71-9022-8091782fe144" containerName="registry-server" containerID="cri-o://383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8" gracePeriod=2 Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.130913 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.291340 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def91c25-0318-4b71-9022-8091782fe144-utilities\") pod \"def91c25-0318-4b71-9022-8091782fe144\" (UID: \"def91c25-0318-4b71-9022-8091782fe144\") " Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.291500 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def91c25-0318-4b71-9022-8091782fe144-catalog-content\") pod \"def91c25-0318-4b71-9022-8091782fe144\" (UID: \"def91c25-0318-4b71-9022-8091782fe144\") " Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.291539 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlmr7\" (UniqueName: \"kubernetes.io/projected/def91c25-0318-4b71-9022-8091782fe144-kube-api-access-qlmr7\") pod \"def91c25-0318-4b71-9022-8091782fe144\" (UID: \"def91c25-0318-4b71-9022-8091782fe144\") " Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.297116 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def91c25-0318-4b71-9022-8091782fe144-utilities" (OuterVolumeSpecName: "utilities") pod "def91c25-0318-4b71-9022-8091782fe144" (UID: "def91c25-0318-4b71-9022-8091782fe144"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.297837 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def91c25-0318-4b71-9022-8091782fe144-kube-api-access-qlmr7" (OuterVolumeSpecName: "kube-api-access-qlmr7") pod "def91c25-0318-4b71-9022-8091782fe144" (UID: "def91c25-0318-4b71-9022-8091782fe144"). InnerVolumeSpecName "kube-api-access-qlmr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.394225 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/def91c25-0318-4b71-9022-8091782fe144-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.394257 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlmr7\" (UniqueName: \"kubernetes.io/projected/def91c25-0318-4b71-9022-8091782fe144-kube-api-access-qlmr7\") on node \"crc\" DevicePath \"\"" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.415280 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def91c25-0318-4b71-9022-8091782fe144-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "def91c25-0318-4b71-9022-8091782fe144" (UID: "def91c25-0318-4b71-9022-8091782fe144"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.497624 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/def91c25-0318-4b71-9022-8091782fe144-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.687107 4585 generic.go:334] "Generic (PLEG): container finished" podID="def91c25-0318-4b71-9022-8091782fe144" containerID="383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8" exitCode=0 Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.687513 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ngv2" event={"ID":"def91c25-0318-4b71-9022-8091782fe144","Type":"ContainerDied","Data":"383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8"} Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.687545 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ngv2" event={"ID":"def91c25-0318-4b71-9022-8091782fe144","Type":"ContainerDied","Data":"7107d07a1abfe64e204beb6dcfddf8d987b76dd0a50a78db80eee8394c45c54e"} Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.687565 4585 scope.go:117] "RemoveContainer" containerID="383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.687722 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ngv2" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.723109 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5ngv2"] Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.732546 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5ngv2"] Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.750302 4585 scope.go:117] "RemoveContainer" containerID="d90ea6b559e913886084d753d30b6ac0427b031f209ca81ee750176cf7326aa8" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.795235 4585 scope.go:117] "RemoveContainer" containerID="e4c1286611f354a708fd0193bf74b156edf8c44d925f468e4171edc75c83f1fb" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.839541 4585 scope.go:117] "RemoveContainer" containerID="383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8" Dec 01 14:54:44 crc kubenswrapper[4585]: E1201 14:54:44.840106 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8\": container with ID starting with 383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8 not found: ID does not exist" containerID="383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.840153 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8"} err="failed to get container status \"383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8\": rpc error: code = NotFound desc = could not find container \"383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8\": container with ID starting with 383b0c32e6062fd6c2f32431b92be604f49049f9a4dcb3e13673549a5c4f94d8 not found: ID does not exist" Dec 01 14:54:44 crc 
kubenswrapper[4585]: I1201 14:54:44.840172 4585 scope.go:117] "RemoveContainer" containerID="d90ea6b559e913886084d753d30b6ac0427b031f209ca81ee750176cf7326aa8" Dec 01 14:54:44 crc kubenswrapper[4585]: E1201 14:54:44.840568 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d90ea6b559e913886084d753d30b6ac0427b031f209ca81ee750176cf7326aa8\": container with ID starting with d90ea6b559e913886084d753d30b6ac0427b031f209ca81ee750176cf7326aa8 not found: ID does not exist" containerID="d90ea6b559e913886084d753d30b6ac0427b031f209ca81ee750176cf7326aa8" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.840589 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d90ea6b559e913886084d753d30b6ac0427b031f209ca81ee750176cf7326aa8"} err="failed to get container status \"d90ea6b559e913886084d753d30b6ac0427b031f209ca81ee750176cf7326aa8\": rpc error: code = NotFound desc = could not find container \"d90ea6b559e913886084d753d30b6ac0427b031f209ca81ee750176cf7326aa8\": container with ID starting with d90ea6b559e913886084d753d30b6ac0427b031f209ca81ee750176cf7326aa8 not found: ID does not exist" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.840628 4585 scope.go:117] "RemoveContainer" containerID="e4c1286611f354a708fd0193bf74b156edf8c44d925f468e4171edc75c83f1fb" Dec 01 14:54:44 crc kubenswrapper[4585]: E1201 14:54:44.841062 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4c1286611f354a708fd0193bf74b156edf8c44d925f468e4171edc75c83f1fb\": container with ID starting with e4c1286611f354a708fd0193bf74b156edf8c44d925f468e4171edc75c83f1fb not found: ID does not exist" containerID="e4c1286611f354a708fd0193bf74b156edf8c44d925f468e4171edc75c83f1fb" Dec 01 14:54:44 crc kubenswrapper[4585]: I1201 14:54:44.841084 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c1286611f354a708fd0193bf74b156edf8c44d925f468e4171edc75c83f1fb"} err="failed to get container status \"e4c1286611f354a708fd0193bf74b156edf8c44d925f468e4171edc75c83f1fb\": rpc error: code = NotFound desc = could not find container \"e4c1286611f354a708fd0193bf74b156edf8c44d925f468e4171edc75c83f1fb\": container with ID starting with e4c1286611f354a708fd0193bf74b156edf8c44d925f468e4171edc75c83f1fb not found: ID does not exist" Dec 01 14:54:46 crc kubenswrapper[4585]: I1201 14:54:46.424697 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def91c25-0318-4b71-9022-8091782fe144" path="/var/lib/kubelet/pods/def91c25-0318-4b71-9022-8091782fe144/volumes" Dec 01 14:54:47 crc kubenswrapper[4585]: I1201 14:54:47.412962 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:54:47 crc kubenswrapper[4585]: I1201 14:54:47.714025 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"1e0b4194459fe2d90ee7c18b376805cee001984de119c4ef64ab23cb42caa200"} Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.032571 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xg2mx/must-gather-qvcpl"] Dec 01 14:55:12 crc kubenswrapper[4585]: E1201 14:55:12.033477 4585 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="def91c25-0318-4b71-9022-8091782fe144" containerName="extract-utilities" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.033489 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="def91c25-0318-4b71-9022-8091782fe144" containerName="extract-utilities" Dec 01 14:55:12 crc kubenswrapper[4585]: E1201 14:55:12.033501 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def91c25-0318-4b71-9022-8091782fe144" containerName="registry-server" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.033508 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="def91c25-0318-4b71-9022-8091782fe144" containerName="registry-server" Dec 01 14:55:12 crc kubenswrapper[4585]: E1201 14:55:12.033535 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def91c25-0318-4b71-9022-8091782fe144" containerName="extract-content" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.033542 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="def91c25-0318-4b71-9022-8091782fe144" containerName="extract-content" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.033737 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="def91c25-0318-4b71-9022-8091782fe144" containerName="registry-server" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.035493 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xg2mx/must-gather-qvcpl" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.058109 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xg2mx"/"default-dockercfg-fbdtz" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.058208 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xg2mx"/"openshift-service-ca.crt" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.058441 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xg2mx"/"kube-root-ca.crt" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.076043 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xg2mx/must-gather-qvcpl"] Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.179321 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8nqq\" (UniqueName: \"kubernetes.io/projected/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464-kube-api-access-z8nqq\") pod \"must-gather-qvcpl\" (UID: \"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464\") " pod="openshift-must-gather-xg2mx/must-gather-qvcpl" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.179410 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464-must-gather-output\") pod \"must-gather-qvcpl\" (UID: \"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464\") " pod="openshift-must-gather-xg2mx/must-gather-qvcpl" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.281480 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464-must-gather-output\") pod \"must-gather-qvcpl\" (UID: \"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464\") " pod="openshift-must-gather-xg2mx/must-gather-qvcpl" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.283233 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-z8nqq\" (UniqueName: \"kubernetes.io/projected/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464-kube-api-access-z8nqq\") pod \"must-gather-qvcpl\" (UID: \"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464\") " pod="openshift-must-gather-xg2mx/must-gather-qvcpl" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.282262 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464-must-gather-output\") pod \"must-gather-qvcpl\" (UID: \"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464\") " pod="openshift-must-gather-xg2mx/must-gather-qvcpl" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.309815 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8nqq\" (UniqueName: \"kubernetes.io/projected/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464-kube-api-access-z8nqq\") pod \"must-gather-qvcpl\" (UID: \"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464\") " pod="openshift-must-gather-xg2mx/must-gather-qvcpl" Dec 01 14:55:12 crc kubenswrapper[4585]: I1201 14:55:12.355316 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xg2mx/must-gather-qvcpl" Dec 01 14:55:13 crc kubenswrapper[4585]: I1201 14:55:13.286730 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xg2mx/must-gather-qvcpl"] Dec 01 14:55:13 crc kubenswrapper[4585]: I1201 14:55:13.958534 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xg2mx/must-gather-qvcpl" event={"ID":"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464","Type":"ContainerStarted","Data":"210950b1e81cfb369091e5b37f2a545c370443254c55b9e51a2704bbad43d73e"} Dec 01 14:55:13 crc kubenswrapper[4585]: I1201 14:55:13.958893 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xg2mx/must-gather-qvcpl" event={"ID":"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464","Type":"ContainerStarted","Data":"ca9d3ea2ced99fc7805b72e4f5ed5eb136047f061564c97ac64f455459b89613"} Dec 01 14:55:13 crc kubenswrapper[4585]: I1201 14:55:13.958907 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xg2mx/must-gather-qvcpl" event={"ID":"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464","Type":"ContainerStarted","Data":"583fe1083d6e5010f2ee7de8dc8892cb4469a6a58d463fe332a9a4df8b838e58"} Dec 01 14:55:13 crc kubenswrapper[4585]: I1201 14:55:13.983614 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xg2mx/must-gather-qvcpl" podStartSLOduration=2.9835892939999997 podStartE2EDuration="2.983589294s" podCreationTimestamp="2025-12-01 14:55:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:55:13.969632716 +0000 UTC m=+3427.953846571" watchObservedRunningTime="2025-12-01 14:55:13.983589294 +0000 UTC m=+3427.967803159" Dec 01 14:55:17 crc kubenswrapper[4585]: I1201 14:55:17.884749 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xg2mx/crc-debug-kslk6"] Dec 01 14:55:17 crc kubenswrapper[4585]: I1201 14:55:17.886645 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xg2mx/crc-debug-kslk6" Dec 01 14:55:17 crc kubenswrapper[4585]: I1201 14:55:17.911213 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5233e17b-3581-4b18-a482-711d0f16f140-host\") pod \"crc-debug-kslk6\" (UID: \"5233e17b-3581-4b18-a482-711d0f16f140\") " pod="openshift-must-gather-xg2mx/crc-debug-kslk6" Dec 01 14:55:17 crc kubenswrapper[4585]: I1201 14:55:17.911283 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2fbc\" (UniqueName: \"kubernetes.io/projected/5233e17b-3581-4b18-a482-711d0f16f140-kube-api-access-g2fbc\") pod \"crc-debug-kslk6\" (UID: \"5233e17b-3581-4b18-a482-711d0f16f140\") " pod="openshift-must-gather-xg2mx/crc-debug-kslk6" Dec 01 14:55:18 crc kubenswrapper[4585]: I1201 14:55:18.013637 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5233e17b-3581-4b18-a482-711d0f16f140-host\") pod \"crc-debug-kslk6\" (UID: \"5233e17b-3581-4b18-a482-711d0f16f140\") " pod="openshift-must-gather-xg2mx/crc-debug-kslk6" Dec 01 14:55:18 crc kubenswrapper[4585]: I1201 14:55:18.013686 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2fbc\" (UniqueName: \"kubernetes.io/projected/5233e17b-3581-4b18-a482-711d0f16f140-kube-api-access-g2fbc\") pod \"crc-debug-kslk6\" (UID: \"5233e17b-3581-4b18-a482-711d0f16f140\") " pod="openshift-must-gather-xg2mx/crc-debug-kslk6" Dec 01 14:55:18 crc kubenswrapper[4585]: I1201 14:55:18.014020 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5233e17b-3581-4b18-a482-711d0f16f140-host\") pod \"crc-debug-kslk6\" (UID: \"5233e17b-3581-4b18-a482-711d0f16f140\") " pod="openshift-must-gather-xg2mx/crc-debug-kslk6" Dec 01 14:55:18 crc kubenswrapper[4585]: I1201 14:55:18.042206 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2fbc\" (UniqueName: \"kubernetes.io/projected/5233e17b-3581-4b18-a482-711d0f16f140-kube-api-access-g2fbc\") pod \"crc-debug-kslk6\" (UID: \"5233e17b-3581-4b18-a482-711d0f16f140\") " pod="openshift-must-gather-xg2mx/crc-debug-kslk6" Dec 01 14:55:18 crc kubenswrapper[4585]: I1201 14:55:18.203616 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xg2mx/crc-debug-kslk6" Dec 01 14:55:19 crc kubenswrapper[4585]: I1201 14:55:19.005058 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xg2mx/crc-debug-kslk6" event={"ID":"5233e17b-3581-4b18-a482-711d0f16f140","Type":"ContainerStarted","Data":"77865c11d7cc0630175435442cf13442d0584636028b440e11b3b26912d69a49"} Dec 01 14:55:19 crc kubenswrapper[4585]: I1201 14:55:19.005727 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xg2mx/crc-debug-kslk6" event={"ID":"5233e17b-3581-4b18-a482-711d0f16f140","Type":"ContainerStarted","Data":"11ef282a67fea550c502cd21328233c5095ea672f19ec43f6831ce4e1eae451f"} Dec 01 14:55:19 crc kubenswrapper[4585]: I1201 14:55:19.028240 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xg2mx/crc-debug-kslk6" podStartSLOduration=2.028217394 podStartE2EDuration="2.028217394s" podCreationTimestamp="2025-12-01 14:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:55:19.020461289 +0000 UTC m=+3433.004675154" watchObservedRunningTime="2025-12-01 14:55:19.028217394 +0000 UTC m=+3433.012431269" Dec 01 14:55:55 crc kubenswrapper[4585]: I1201 14:55:55.312984 4585 generic.go:334] "Generic (PLEG): container finished" podID="5233e17b-3581-4b18-a482-711d0f16f140" containerID="77865c11d7cc0630175435442cf13442d0584636028b440e11b3b26912d69a49" exitCode=0 Dec 01 14:55:55 crc kubenswrapper[4585]: I1201 14:55:55.313085 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xg2mx/crc-debug-kslk6" event={"ID":"5233e17b-3581-4b18-a482-711d0f16f140","Type":"ContainerDied","Data":"77865c11d7cc0630175435442cf13442d0584636028b440e11b3b26912d69a49"} Dec 01 14:55:56 crc kubenswrapper[4585]: I1201 14:55:56.421337 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xg2mx/crc-debug-kslk6" Dec 01 14:55:56 crc kubenswrapper[4585]: I1201 14:55:56.452038 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5233e17b-3581-4b18-a482-711d0f16f140-host\") pod \"5233e17b-3581-4b18-a482-711d0f16f140\" (UID: \"5233e17b-3581-4b18-a482-711d0f16f140\") " Dec 01 14:55:56 crc kubenswrapper[4585]: I1201 14:55:56.452136 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2fbc\" (UniqueName: \"kubernetes.io/projected/5233e17b-3581-4b18-a482-711d0f16f140-kube-api-access-g2fbc\") pod \"5233e17b-3581-4b18-a482-711d0f16f140\" (UID: \"5233e17b-3581-4b18-a482-711d0f16f140\") " Dec 01 14:55:56 crc kubenswrapper[4585]: I1201 14:55:56.452236 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5233e17b-3581-4b18-a482-711d0f16f140-host" (OuterVolumeSpecName: "host") pod "5233e17b-3581-4b18-a482-711d0f16f140" (UID: "5233e17b-3581-4b18-a482-711d0f16f140"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:55:56 crc kubenswrapper[4585]: I1201 14:55:56.452546 4585 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5233e17b-3581-4b18-a482-711d0f16f140-host\") on node \"crc\" DevicePath \"\"" Dec 01 14:55:56 crc kubenswrapper[4585]: I1201 14:55:56.459337 4585 status_manager.go:875] "Failed to update status for pod" pod="openshift-must-gather-xg2mx/crc-debug-kslk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5233e17b-3581-4b18-a482-711d0f16f140\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T14:55:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"}]}}\" for pod \"openshift-must-gather-xg2mx\"/\"crc-debug-kslk6\": pods \"crc-debug-kslk6\" not found" Dec 01 14:55:56 crc kubenswrapper[4585]: I1201 14:55:56.465023 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5233e17b-3581-4b18-a482-711d0f16f140-kube-api-access-g2fbc" (OuterVolumeSpecName: "kube-api-access-g2fbc") pod "5233e17b-3581-4b18-a482-711d0f16f140" (UID: "5233e17b-3581-4b18-a482-711d0f16f140"). InnerVolumeSpecName "kube-api-access-g2fbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:55:56 crc kubenswrapper[4585]: I1201 14:55:56.481706 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xg2mx/crc-debug-kslk6"] Dec 01 14:55:56 crc kubenswrapper[4585]: I1201 14:55:56.492194 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xg2mx/crc-debug-kslk6"] Dec 01 14:55:56 crc kubenswrapper[4585]: I1201 14:55:56.556744 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2fbc\" (UniqueName: \"kubernetes.io/projected/5233e17b-3581-4b18-a482-711d0f16f140-kube-api-access-g2fbc\") on node \"crc\" DevicePath \"\"" Dec 01 14:55:57 crc kubenswrapper[4585]: I1201 14:55:57.336541 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11ef282a67fea550c502cd21328233c5095ea672f19ec43f6831ce4e1eae451f" Dec 01 14:55:57 crc kubenswrapper[4585]: I1201 14:55:57.336699 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xg2mx/crc-debug-kslk6" Dec 01 14:55:57 crc kubenswrapper[4585]: I1201 14:55:57.763231 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xg2mx/crc-debug-t5h6j"] Dec 01 14:55:57 crc kubenswrapper[4585]: E1201 14:55:57.763685 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5233e17b-3581-4b18-a482-711d0f16f140" containerName="container-00" Dec 01 14:55:57 crc kubenswrapper[4585]: I1201 14:55:57.763708 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="5233e17b-3581-4b18-a482-711d0f16f140" containerName="container-00" Dec 01 14:55:57 crc kubenswrapper[4585]: I1201 14:55:57.763911 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="5233e17b-3581-4b18-a482-711d0f16f140" containerName="container-00" Dec 01 14:55:57 crc kubenswrapper[4585]: I1201 14:55:57.764669 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" Dec 01 14:55:57 crc kubenswrapper[4585]: I1201 14:55:57.888219 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbdd4\" (UniqueName: \"kubernetes.io/projected/b16af1c5-295a-46c9-8280-de3b133967f1-kube-api-access-pbdd4\") pod \"crc-debug-t5h6j\" (UID: \"b16af1c5-295a-46c9-8280-de3b133967f1\") " pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" Dec 01 14:55:57 crc kubenswrapper[4585]: I1201 14:55:57.888348 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b16af1c5-295a-46c9-8280-de3b133967f1-host\") pod \"crc-debug-t5h6j\" (UID: \"b16af1c5-295a-46c9-8280-de3b133967f1\") " pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" Dec 01 14:55:57 crc kubenswrapper[4585]: I1201 14:55:57.989906 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b16af1c5-295a-46c9-8280-de3b133967f1-host\") pod \"crc-debug-t5h6j\" (UID: \"b16af1c5-295a-46c9-8280-de3b133967f1\") " pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" Dec 01 14:55:57 crc kubenswrapper[4585]: I1201 14:55:57.990453 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbdd4\" (UniqueName: \"kubernetes.io/projected/b16af1c5-295a-46c9-8280-de3b133967f1-kube-api-access-pbdd4\") pod \"crc-debug-t5h6j\" (UID: \"b16af1c5-295a-46c9-8280-de3b133967f1\") " pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" Dec 01 14:55:57 crc kubenswrapper[4585]: I1201 14:55:57.990917 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b16af1c5-295a-46c9-8280-de3b133967f1-host\") pod \"crc-debug-t5h6j\" (UID: \"b16af1c5-295a-46c9-8280-de3b133967f1\") " pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" Dec 01 14:55:58 crc kubenswrapper[4585]: I1201 14:55:58.024068 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbdd4\" (UniqueName: \"kubernetes.io/projected/b16af1c5-295a-46c9-8280-de3b133967f1-kube-api-access-pbdd4\") pod \"crc-debug-t5h6j\" (UID: \"b16af1c5-295a-46c9-8280-de3b133967f1\") " pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" Dec 01 14:55:58 crc kubenswrapper[4585]: I1201 14:55:58.086616 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" Dec 01 14:55:58 crc kubenswrapper[4585]: I1201 14:55:58.351487 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" event={"ID":"b16af1c5-295a-46c9-8280-de3b133967f1","Type":"ContainerStarted","Data":"7ea08ca6f15d19e75cc2204ed4d58b03151a7e5bc69be3ec784126fbeb3e37eb"} Dec 01 14:55:58 crc kubenswrapper[4585]: I1201 14:55:58.351851 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" event={"ID":"b16af1c5-295a-46c9-8280-de3b133967f1","Type":"ContainerStarted","Data":"f52db319f7f6c6c125c7e5b9936b034c726e939544d89a71833c1ebc9093def9"} Dec 01 14:55:58 crc kubenswrapper[4585]: I1201 14:55:58.381053 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" podStartSLOduration=1.38103146 podStartE2EDuration="1.38103146s" podCreationTimestamp="2025-12-01 14:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 14:55:58.371215671 +0000 UTC m=+3472.355429526" watchObservedRunningTime="2025-12-01 14:55:58.38103146 +0000 UTC m=+3472.365245315" Dec 01 14:55:58 crc kubenswrapper[4585]: I1201 14:55:58.421691 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5233e17b-3581-4b18-a482-711d0f16f140" path="/var/lib/kubelet/pods/5233e17b-3581-4b18-a482-711d0f16f140/volumes" Dec 01 14:55:59 crc kubenswrapper[4585]: I1201 14:55:59.388898 4585 generic.go:334] "Generic (PLEG): container finished" podID="b16af1c5-295a-46c9-8280-de3b133967f1" containerID="7ea08ca6f15d19e75cc2204ed4d58b03151a7e5bc69be3ec784126fbeb3e37eb" exitCode=0 Dec 01 14:55:59 crc kubenswrapper[4585]: I1201 14:55:59.388949 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" event={"ID":"b16af1c5-295a-46c9-8280-de3b133967f1","Type":"ContainerDied","Data":"7ea08ca6f15d19e75cc2204ed4d58b03151a7e5bc69be3ec784126fbeb3e37eb"} Dec 01 14:56:00 crc kubenswrapper[4585]: I1201 14:56:00.495668 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" Dec 01 14:56:00 crc kubenswrapper[4585]: I1201 14:56:00.522922 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xg2mx/crc-debug-t5h6j"] Dec 01 14:56:00 crc kubenswrapper[4585]: I1201 14:56:00.531718 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbdd4\" (UniqueName: \"kubernetes.io/projected/b16af1c5-295a-46c9-8280-de3b133967f1-kube-api-access-pbdd4\") pod \"b16af1c5-295a-46c9-8280-de3b133967f1\" (UID: \"b16af1c5-295a-46c9-8280-de3b133967f1\") " Dec 01 14:56:00 crc kubenswrapper[4585]: I1201 14:56:00.531937 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xg2mx/crc-debug-t5h6j"] Dec 01 14:56:00 crc kubenswrapper[4585]: I1201 14:56:00.532247 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b16af1c5-295a-46c9-8280-de3b133967f1-host\") pod \"b16af1c5-295a-46c9-8280-de3b133967f1\" (UID: \"b16af1c5-295a-46c9-8280-de3b133967f1\") " Dec 01 14:56:00 crc kubenswrapper[4585]: I1201 14:56:00.532304 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b16af1c5-295a-46c9-8280-de3b133967f1-host" (OuterVolumeSpecName: "host") pod "b16af1c5-295a-46c9-8280-de3b133967f1" (UID: "b16af1c5-295a-46c9-8280-de3b133967f1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:56:00 crc kubenswrapper[4585]: I1201 14:56:00.532835 4585 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b16af1c5-295a-46c9-8280-de3b133967f1-host\") on node \"crc\" DevicePath \"\"" Dec 01 14:56:00 crc kubenswrapper[4585]: I1201 14:56:00.537389 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b16af1c5-295a-46c9-8280-de3b133967f1-kube-api-access-pbdd4" (OuterVolumeSpecName: "kube-api-access-pbdd4") pod "b16af1c5-295a-46c9-8280-de3b133967f1" (UID: "b16af1c5-295a-46c9-8280-de3b133967f1"). InnerVolumeSpecName "kube-api-access-pbdd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:56:00 crc kubenswrapper[4585]: I1201 14:56:00.635235 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbdd4\" (UniqueName: \"kubernetes.io/projected/b16af1c5-295a-46c9-8280-de3b133967f1-kube-api-access-pbdd4\") on node \"crc\" DevicePath \"\"" Dec 01 14:56:01 crc kubenswrapper[4585]: I1201 14:56:01.406663 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f52db319f7f6c6c125c7e5b9936b034c726e939544d89a71833c1ebc9093def9" Dec 01 14:56:01 crc kubenswrapper[4585]: I1201 14:56:01.406788 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xg2mx/crc-debug-t5h6j" Dec 01 14:56:01 crc kubenswrapper[4585]: I1201 14:56:01.700434 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xg2mx/crc-debug-mmd82"] Dec 01 14:56:01 crc kubenswrapper[4585]: E1201 14:56:01.701909 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b16af1c5-295a-46c9-8280-de3b133967f1" containerName="container-00" Dec 01 14:56:01 crc kubenswrapper[4585]: I1201 14:56:01.701933 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16af1c5-295a-46c9-8280-de3b133967f1" containerName="container-00" Dec 01 14:56:01 crc kubenswrapper[4585]: I1201 14:56:01.702257 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="b16af1c5-295a-46c9-8280-de3b133967f1" containerName="container-00" Dec 01 14:56:01 crc kubenswrapper[4585]: I1201 14:56:01.703103 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xg2mx/crc-debug-mmd82" Dec 01 14:56:01 crc kubenswrapper[4585]: I1201 14:56:01.757410 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmdxk\" (UniqueName: \"kubernetes.io/projected/f4f0138f-de44-4408-99dc-fc5f1a7afe96-kube-api-access-bmdxk\") pod \"crc-debug-mmd82\" (UID: \"f4f0138f-de44-4408-99dc-fc5f1a7afe96\") " pod="openshift-must-gather-xg2mx/crc-debug-mmd82" Dec 01 14:56:01 crc kubenswrapper[4585]: I1201 14:56:01.757654 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4f0138f-de44-4408-99dc-fc5f1a7afe96-host\") pod \"crc-debug-mmd82\" (UID: \"f4f0138f-de44-4408-99dc-fc5f1a7afe96\") " pod="openshift-must-gather-xg2mx/crc-debug-mmd82" Dec 01 14:56:01 crc kubenswrapper[4585]: I1201 14:56:01.861158 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4f0138f-de44-4408-99dc-fc5f1a7afe96-host\") pod \"crc-debug-mmd82\" (UID: \"f4f0138f-de44-4408-99dc-fc5f1a7afe96\") " pod="openshift-must-gather-xg2mx/crc-debug-mmd82" Dec 01 14:56:01 crc kubenswrapper[4585]: I1201 14:56:01.861327 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmdxk\" (UniqueName: \"kubernetes.io/projected/f4f0138f-de44-4408-99dc-fc5f1a7afe96-kube-api-access-bmdxk\") pod \"crc-debug-mmd82\" (UID: \"f4f0138f-de44-4408-99dc-fc5f1a7afe96\") " pod="openshift-must-gather-xg2mx/crc-debug-mmd82" Dec 01 14:56:01 crc kubenswrapper[4585]: I1201 14:56:01.861528 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4f0138f-de44-4408-99dc-fc5f1a7afe96-host\") pod \"crc-debug-mmd82\" (UID: \"f4f0138f-de44-4408-99dc-fc5f1a7afe96\") " pod="openshift-must-gather-xg2mx/crc-debug-mmd82" Dec 01 14:56:01 crc kubenswrapper[4585]: I1201 14:56:01.891911 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmdxk\" (UniqueName: \"kubernetes.io/projected/f4f0138f-de44-4408-99dc-fc5f1a7afe96-kube-api-access-bmdxk\") pod \"crc-debug-mmd82\" (UID: \"f4f0138f-de44-4408-99dc-fc5f1a7afe96\") " pod="openshift-must-gather-xg2mx/crc-debug-mmd82" Dec 01 14:56:02 crc kubenswrapper[4585]: I1201 14:56:02.022115 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xg2mx/crc-debug-mmd82" Dec 01 14:56:02 crc kubenswrapper[4585]: I1201 14:56:02.417300 4585 generic.go:334] "Generic (PLEG): container finished" podID="f4f0138f-de44-4408-99dc-fc5f1a7afe96" containerID="8a4e9b7bd1785df20784e961a33e8b768fdedc1774f04b0b622a9670d3864dbb" exitCode=0 Dec 01 14:56:02 crc kubenswrapper[4585]: I1201 14:56:02.422411 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b16af1c5-295a-46c9-8280-de3b133967f1" path="/var/lib/kubelet/pods/b16af1c5-295a-46c9-8280-de3b133967f1/volumes" Dec 01 14:56:02 crc kubenswrapper[4585]: I1201 14:56:02.423025 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xg2mx/crc-debug-mmd82" event={"ID":"f4f0138f-de44-4408-99dc-fc5f1a7afe96","Type":"ContainerDied","Data":"8a4e9b7bd1785df20784e961a33e8b768fdedc1774f04b0b622a9670d3864dbb"} Dec 01 14:56:02 crc kubenswrapper[4585]: I1201 14:56:02.423056 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xg2mx/crc-debug-mmd82" event={"ID":"f4f0138f-de44-4408-99dc-fc5f1a7afe96","Type":"ContainerStarted","Data":"ba3c053dfd7e791e4d6e3cf9e9a1d77c66cbe148411f225aab62c4e151659bd5"} Dec 01 14:56:02 crc kubenswrapper[4585]: I1201 14:56:02.482103 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xg2mx/crc-debug-mmd82"] Dec 01 14:56:02 crc kubenswrapper[4585]: I1201 14:56:02.491429 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xg2mx/crc-debug-mmd82"] Dec 01 14:56:03 crc kubenswrapper[4585]: I1201 14:56:03.553006 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xg2mx/crc-debug-mmd82" Dec 01 14:56:03 crc kubenswrapper[4585]: I1201 14:56:03.600367 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4f0138f-de44-4408-99dc-fc5f1a7afe96-host\") pod \"f4f0138f-de44-4408-99dc-fc5f1a7afe96\" (UID: \"f4f0138f-de44-4408-99dc-fc5f1a7afe96\") " Dec 01 14:56:03 crc kubenswrapper[4585]: I1201 14:56:03.600487 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmdxk\" (UniqueName: \"kubernetes.io/projected/f4f0138f-de44-4408-99dc-fc5f1a7afe96-kube-api-access-bmdxk\") pod \"f4f0138f-de44-4408-99dc-fc5f1a7afe96\" (UID: \"f4f0138f-de44-4408-99dc-fc5f1a7afe96\") " Dec 01 14:56:03 crc kubenswrapper[4585]: I1201 14:56:03.600501 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4f0138f-de44-4408-99dc-fc5f1a7afe96-host" (OuterVolumeSpecName: "host") pod "f4f0138f-de44-4408-99dc-fc5f1a7afe96" (UID: "f4f0138f-de44-4408-99dc-fc5f1a7afe96"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 14:56:03 crc kubenswrapper[4585]: I1201 14:56:03.601190 4585 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4f0138f-de44-4408-99dc-fc5f1a7afe96-host\") on node \"crc\" DevicePath \"\"" Dec 01 14:56:03 crc kubenswrapper[4585]: I1201 14:56:03.606286 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f0138f-de44-4408-99dc-fc5f1a7afe96-kube-api-access-bmdxk" (OuterVolumeSpecName: "kube-api-access-bmdxk") pod "f4f0138f-de44-4408-99dc-fc5f1a7afe96" (UID: "f4f0138f-de44-4408-99dc-fc5f1a7afe96"). InnerVolumeSpecName "kube-api-access-bmdxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:56:03 crc kubenswrapper[4585]: I1201 14:56:03.703070 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmdxk\" (UniqueName: \"kubernetes.io/projected/f4f0138f-de44-4408-99dc-fc5f1a7afe96-kube-api-access-bmdxk\") on node \"crc\" DevicePath \"\"" Dec 01 14:56:04 crc kubenswrapper[4585]: I1201 14:56:04.423784 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f0138f-de44-4408-99dc-fc5f1a7afe96" path="/var/lib/kubelet/pods/f4f0138f-de44-4408-99dc-fc5f1a7afe96/volumes" Dec 01 14:56:04 crc kubenswrapper[4585]: I1201 14:56:04.435310 4585 scope.go:117] "RemoveContainer" containerID="8a4e9b7bd1785df20784e961a33e8b768fdedc1774f04b0b622a9670d3864dbb" Dec 01 14:56:04 crc kubenswrapper[4585]: I1201 14:56:04.435394 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xg2mx/crc-debug-mmd82" Dec 01 14:56:34 crc kubenswrapper[4585]: I1201 14:56:34.381287 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6564675f78-48rkf_87101522-5785-472c-9563-d86146676171/barbican-api/0.log" Dec 01 14:56:34 crc kubenswrapper[4585]: I1201 14:56:34.536827 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6564675f78-48rkf_87101522-5785-472c-9563-d86146676171/barbican-api-log/0.log" Dec 01 14:56:34 crc kubenswrapper[4585]: I1201 14:56:34.594222 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-868456d7cd-m64gp_31b37f1e-9af0-4922-8e48-55f82175411c/barbican-keystone-listener/0.log" Dec 01 14:56:34 crc kubenswrapper[4585]: I1201 14:56:34.711724 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-868456d7cd-m64gp_31b37f1e-9af0-4922-8e48-55f82175411c/barbican-keystone-listener-log/0.log" Dec 01 14:56:34 crc kubenswrapper[4585]: I1201 14:56:34.881329 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65f6dd57f9-22fc5_61f0d5fb-5daa-4828-a1a8-92dc39a7c822/barbican-worker/0.log" Dec 01 14:56:34 crc kubenswrapper[4585]: I1201 14:56:34.935342 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65f6dd57f9-22fc5_61f0d5fb-5daa-4828-a1a8-92dc39a7c822/barbican-worker-log/0.log" Dec 01 14:56:35 crc kubenswrapper[4585]: I1201 14:56:35.184716 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-6t4hr_9d1f4c36-f08f-4359-a950-a506d064998b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:35 crc kubenswrapper[4585]: I1201 14:56:35.186865 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7642d2c9-b2cc-400c-b45a-957690fb2e86/ceilometer-central-agent/0.log" Dec 01 14:56:35 crc kubenswrapper[4585]: I1201 14:56:35.299586 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7642d2c9-b2cc-400c-b45a-957690fb2e86/ceilometer-notification-agent/0.log" Dec 01 14:56:35 crc kubenswrapper[4585]: I1201 14:56:35.369399 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7642d2c9-b2cc-400c-b45a-957690fb2e86/proxy-httpd/0.log" Dec 01 14:56:35 crc kubenswrapper[4585]: I1201 14:56:35.439845 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7642d2c9-b2cc-400c-b45a-957690fb2e86/sg-core/0.log" Dec 01 14:56:35 crc 
kubenswrapper[4585]: I1201 14:56:35.629195 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_952acec2-d757-4b65-aaf3-61bb69e5d5d7/cinder-api/0.log" Dec 01 14:56:35 crc kubenswrapper[4585]: I1201 14:56:35.641397 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_952acec2-d757-4b65-aaf3-61bb69e5d5d7/cinder-api-log/0.log" Dec 01 14:56:35 crc kubenswrapper[4585]: I1201 14:56:35.809017 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_68b9fd2e-0f05-46d6-86aa-319cbbf01db1/cinder-scheduler/0.log" Dec 01 14:56:35 crc kubenswrapper[4585]: I1201 14:56:35.903188 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_68b9fd2e-0f05-46d6-86aa-319cbbf01db1/probe/0.log" Dec 01 14:56:35 crc kubenswrapper[4585]: I1201 14:56:35.972698 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xjc7h_16e7590a-927c-4ff1-8eb8-3b4a248ce6f8/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:36 crc kubenswrapper[4585]: I1201 14:56:36.131550 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hb7s8_c12f5739-060f-4047-b987-d8c958aeb133/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:36 crc kubenswrapper[4585]: I1201 14:56:36.231993 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cd9bffc9-cjbbt_7e89a01c-fb21-4027-bbeb-6bfe70da33d0/init/0.log" Dec 01 14:56:36 crc kubenswrapper[4585]: I1201 14:56:36.412037 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cd9bffc9-cjbbt_7e89a01c-fb21-4027-bbeb-6bfe70da33d0/init/0.log" Dec 01 14:56:36 crc kubenswrapper[4585]: I1201 14:56:36.544760 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pw7wp_bed7040b-db55-41ae-9384-7b730ced5331/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:36 crc kubenswrapper[4585]: I1201 14:56:36.553856 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6cd9bffc9-cjbbt_7e89a01c-fb21-4027-bbeb-6bfe70da33d0/dnsmasq-dns/0.log" Dec 01 14:56:36 crc kubenswrapper[4585]: I1201 14:56:36.786576 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2e570020-789a-4807-9cff-651caad31856/glance-log/0.log" Dec 01 14:56:36 crc kubenswrapper[4585]: I1201 14:56:36.821919 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2e570020-789a-4807-9cff-651caad31856/glance-httpd/0.log" Dec 01 14:56:36 crc kubenswrapper[4585]: I1201 14:56:36.996734 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_bba1bc55-ebde-45b0-a2bf-1b05117fc134/glance-httpd/0.log" Dec 01 14:56:37 crc kubenswrapper[4585]: I1201 14:56:37.064127 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_bba1bc55-ebde-45b0-a2bf-1b05117fc134/glance-log/0.log" Dec 01 14:56:37 crc kubenswrapper[4585]: I1201 14:56:37.179881 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bbf659b46-55tth_e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1/horizon/1.log" Dec 01 14:56:37 crc kubenswrapper[4585]: I1201 14:56:37.310406 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-6bbf659b46-55tth_e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1/horizon/0.log" Dec 01 14:56:37 crc kubenswrapper[4585]: I1201 14:56:37.480685 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-29vjp_f27fcf2c-32e5-488b-b1f2-3f61b3096f4f/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:37 crc kubenswrapper[4585]: I1201 14:56:37.635850 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6bbf659b46-55tth_e3b59a9b-bef9-4f7c-b861-bf6bc2a8bac1/horizon-log/0.log" Dec 01 14:56:37 crc kubenswrapper[4585]: I1201 14:56:37.748239 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-pzrlq_0919f038-3bf5-4f3c-baa3-5c85ceef4819/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:38 crc kubenswrapper[4585]: I1201 14:56:38.006850 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-576d96b8bf-jl74m_2ccd2ad6-369c-4511-9704-9f091dac6dd7/keystone-api/0.log" Dec 01 14:56:38 crc kubenswrapper[4585]: I1201 14:56:38.044061 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e6dc37b7-09de-4e17-9d88-358b3d3d5908/kube-state-metrics/0.log" Dec 01 14:56:38 crc kubenswrapper[4585]: I1201 14:56:38.261784 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6x5zw_a592f160-6520-4d70-94bd-5064e63fa1a0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:38 crc kubenswrapper[4585]: I1201 14:56:38.668294 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-fc9dbdd9-h6k6z_197759e4-0035-4430-9bee-483578d6804e/neutron-httpd/0.log" Dec 01 14:56:38 crc kubenswrapper[4585]: I1201 14:56:38.672796 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-fc9dbdd9-h6k6z_197759e4-0035-4430-9bee-483578d6804e/neutron-api/0.log" Dec 01 14:56:38 crc kubenswrapper[4585]: I1201 14:56:38.810880 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_bbcd4b11-d625-4425-82d8-7c32d8c24c5c/memcached/0.log" Dec 01 14:56:38 crc kubenswrapper[4585]: I1201 14:56:38.924220 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-nw4f8_1a9f93f2-8afc-428b-9578-dd3353f8a43b/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:39 crc kubenswrapper[4585]: I1201 14:56:39.124088 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f2bbc3d0-64bb-4942-8ee2-a05b538ec68f/nova-api-log/0.log" Dec 01 14:56:39 crc kubenswrapper[4585]: I1201 14:56:39.300350 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f2bbc3d0-64bb-4942-8ee2-a05b538ec68f/nova-api-api/0.log" Dec 01 14:56:39 crc kubenswrapper[4585]: I1201 14:56:39.446326 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_1ae9e96d-f1e0-4183-9034-f553c8af4864/nova-cell0-conductor-conductor/0.log" Dec 01 14:56:39 crc kubenswrapper[4585]: I1201 14:56:39.496214 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4cf9cb62-4c4f-43ae-8a94-78eafbb19a82/nova-cell1-conductor-conductor/0.log" Dec 01 14:56:39 crc kubenswrapper[4585]: I1201 14:56:39.664091 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_1ce769f5-6fc1-4585-a050-98ec1e1d9915/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 14:56:39 crc kubenswrapper[4585]: I1201 14:56:39.702559 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-b48gn_a10b857d-29b7-46a5-9c12-775200f3ab74/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:39 crc kubenswrapper[4585]: I1201 14:56:39.979287 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_61c99c9c-c763-4e3a-8e71-6d78e3779991/nova-metadata-log/0.log" Dec 01 14:56:40 crc kubenswrapper[4585]: I1201 14:56:40.212721 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_471a678b-d81a-4526-b826-65b359685c99/mysql-bootstrap/0.log" Dec 01 14:56:40 crc kubenswrapper[4585]: I1201 14:56:40.296556 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_779dee18-6c1e-4e00-b3be-22ce3d8e2259/nova-scheduler-scheduler/0.log" Dec 01 14:56:40 crc kubenswrapper[4585]: I1201 14:56:40.526439 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_471a678b-d81a-4526-b826-65b359685c99/galera/0.log" Dec 01 14:56:40 crc kubenswrapper[4585]: I1201 14:56:40.541925 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_471a678b-d81a-4526-b826-65b359685c99/mysql-bootstrap/0.log" Dec 01 14:56:40 crc kubenswrapper[4585]: I1201 14:56:40.670454 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_61a59437-2c03-417a-839f-6b610fa43a83/mysql-bootstrap/0.log" Dec 01 14:56:40 crc kubenswrapper[4585]: I1201 14:56:40.677466 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_61c99c9c-c763-4e3a-8e71-6d78e3779991/nova-metadata-metadata/0.log" Dec 01 14:56:40 crc kubenswrapper[4585]: I1201 14:56:40.848089 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_61a59437-2c03-417a-839f-6b610fa43a83/mysql-bootstrap/0.log" Dec 01 14:56:40 crc kubenswrapper[4585]: I1201 14:56:40.889186 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_61a59437-2c03-417a-839f-6b610fa43a83/galera/0.log" Dec 01 14:56:40 crc kubenswrapper[4585]: I1201 14:56:40.928187 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_29f23dd7-40f6-4677-bd4c-ebdf3152b72f/openstackclient/0.log" Dec 01 14:56:41 crc kubenswrapper[4585]: I1201 14:56:41.122168 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-m7xvm_2f3d9474-e60e-401e-8597-1bd7af4f34c3/ovn-controller/0.log" Dec 01 14:56:41 crc kubenswrapper[4585]: I1201 14:56:41.131236 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xddp8_5299e047-b328-440f-a888-8001cad4933b/openstack-network-exporter/0.log" Dec 01 14:56:41 crc kubenswrapper[4585]: I1201 14:56:41.250287 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rt4xq_7c01d629-7b26-457f-8ab7-e67464b2e578/ovsdb-server-init/0.log" Dec 01 14:56:41 crc kubenswrapper[4585]: I1201 14:56:41.401449 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rt4xq_7c01d629-7b26-457f-8ab7-e67464b2e578/ovsdb-server-init/0.log" Dec 01 14:56:41 crc kubenswrapper[4585]: I1201 14:56:41.406735 4585 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rt4xq_7c01d629-7b26-457f-8ab7-e67464b2e578/ovsdb-server/0.log" Dec 01 14:56:41 crc kubenswrapper[4585]: I1201 14:56:41.457305 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rt4xq_7c01d629-7b26-457f-8ab7-e67464b2e578/ovs-vswitchd/0.log" Dec 01 14:56:41 crc kubenswrapper[4585]: I1201 14:56:41.542568 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-n7w8n_3a6c0545-e1eb-412f-afee-4764733eff64/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:41 crc kubenswrapper[4585]: I1201 14:56:41.686599 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e15da9d0-0ba7-4885-8da4-89631b7886f6/openstack-network-exporter/0.log" Dec 01 14:56:41 crc kubenswrapper[4585]: I1201 14:56:41.732402 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e15da9d0-0ba7-4885-8da4-89631b7886f6/ovn-northd/0.log" Dec 01 14:56:41 crc kubenswrapper[4585]: I1201 14:56:41.778034 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fa90f38d-8525-4b91-9a7b-717ddc968614/openstack-network-exporter/0.log" Dec 01 14:56:41 crc kubenswrapper[4585]: I1201 14:56:41.884999 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fa90f38d-8525-4b91-9a7b-717ddc968614/ovsdbserver-nb/0.log" Dec 01 14:56:41 crc kubenswrapper[4585]: I1201 14:56:41.992688 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a3ceca23-b268-4b00-a4c2-026390eae759/openstack-network-exporter/0.log" Dec 01 14:56:42 crc kubenswrapper[4585]: I1201 14:56:42.001452 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a3ceca23-b268-4b00-a4c2-026390eae759/ovsdbserver-sb/0.log" Dec 01 14:56:42 crc kubenswrapper[4585]: I1201 14:56:42.139397 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5594675dd-jdqsw_c8ba0e29-a5ba-4540-a9b6-154a30ff9e99/placement-api/0.log" Dec 01 14:56:42 crc kubenswrapper[4585]: I1201 14:56:42.209594 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5594675dd-jdqsw_c8ba0e29-a5ba-4540-a9b6-154a30ff9e99/placement-log/0.log" Dec 01 14:56:42 crc kubenswrapper[4585]: I1201 14:56:42.252670 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_645c2200-d127-4ffe-a91e-9f9ae104dc06/setup-container/0.log" Dec 01 14:56:42 crc kubenswrapper[4585]: I1201 14:56:42.442042 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_645c2200-d127-4ffe-a91e-9f9ae104dc06/setup-container/0.log" Dec 01 14:56:42 crc kubenswrapper[4585]: I1201 14:56:42.453297 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_645c2200-d127-4ffe-a91e-9f9ae104dc06/rabbitmq/0.log" Dec 01 14:56:42 crc kubenswrapper[4585]: I1201 14:56:42.498202 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b271e13c-b935-4f31-a32d-865af7228e55/setup-container/0.log" Dec 01 14:56:42 crc kubenswrapper[4585]: I1201 14:56:42.680076 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b271e13c-b935-4f31-a32d-865af7228e55/rabbitmq/0.log" Dec 01 14:56:42 crc kubenswrapper[4585]: I1201 14:56:42.690239 4585 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-g98cz_27a8adc5-7598-4bf7-b46f-9a853afce3e6/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:42 crc kubenswrapper[4585]: I1201 14:56:42.737999 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b271e13c-b935-4f31-a32d-865af7228e55/setup-container/0.log" Dec 01 14:56:42 crc kubenswrapper[4585]: I1201 14:56:42.967539 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-47qxg_883ed263-3b11-459f-83d8-c29a49f9c79c/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.000038 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-p647g_c334f141-1564-4112-a013-53207cf5900c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.051904 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-td45x_3d807047-8744-4a9e-9bf8-1f492a8034b5/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.291081 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-pbt6b_31a42d9c-35e4-437d-8f54-47a3cef27d7e/ssh-known-hosts-edpm-deployment/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.397308 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5dc687f89-lzwxh_8f12e73f-f03a-4a68-a3e5-d4373d8fc583/proxy-httpd/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.406839 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5dc687f89-lzwxh_8f12e73f-f03a-4a68-a3e5-d4373d8fc583/proxy-server/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.512949 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-7zn7j_26bcdef2-b1e8-4848-abc4-b1f6a45c9916/swift-ring-rebalance/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.635200 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/account-auditor/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.651951 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/account-reaper/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.704289 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/account-replicator/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.772802 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/account-server/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.852989 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/container-auditor/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.898761 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/container-replicator/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.904179 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/container-server/0.log" Dec 01 14:56:43 crc kubenswrapper[4585]: I1201 14:56:43.953663 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/container-updater/0.log" Dec 01 14:56:44 crc kubenswrapper[4585]: I1201 14:56:44.010181 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/object-auditor/0.log" Dec 01 14:56:44 crc kubenswrapper[4585]: I1201 14:56:44.054804 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/object-expirer/0.log" Dec 01 14:56:44 crc kubenswrapper[4585]: I1201 14:56:44.165472 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/object-server/0.log" Dec 01 14:56:44 crc kubenswrapper[4585]: I1201 14:56:44.173714 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/object-replicator/0.log" Dec 01 14:56:44 crc kubenswrapper[4585]: I1201 14:56:44.384073 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/object-updater/0.log" Dec 01 14:56:44 crc kubenswrapper[4585]: I1201 14:56:44.454400 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/rsync/0.log" Dec 01 14:56:44 crc kubenswrapper[4585]: I1201 14:56:44.505027 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3bc5c97f-1882-47da-843c-f8dba234f1f3/swift-recon-cron/0.log" Dec 01 14:56:44 crc kubenswrapper[4585]: I1201 14:56:44.608076 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-8cmsb_1232f97e-9bf9-4917-b806-e5de8f180f70/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:44 crc kubenswrapper[4585]: I1201 14:56:44.709278 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_8c35f110-b7a3-4cbc-b181-1589a74f5d89/tempest-tests-tempest-tests-runner/0.log" Dec 01 14:56:44 crc kubenswrapper[4585]: I1201 14:56:44.828090 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e0b341a5-c1b2-40a7-b2c4-0128fe7a389f/test-operator-logs-container/0.log" Dec 01 14:56:44 crc kubenswrapper[4585]: I1201 14:56:44.928850 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-jrbqs_a09e5590-d28a-4c20-80eb-ff1f448ec290/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.079024 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-td9hc"] Dec 01 14:56:59 crc kubenswrapper[4585]: E1201 14:56:59.079884 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f0138f-de44-4408-99dc-fc5f1a7afe96" containerName="container-00" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.079898 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f0138f-de44-4408-99dc-fc5f1a7afe96" containerName="container-00" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.080090 4585 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f4f0138f-de44-4408-99dc-fc5f1a7afe96" containerName="container-00" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.081491 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.108316 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-td9hc"] Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.208408 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04714062-a561-43b3-a168-905574386a8c-catalog-content\") pod \"community-operators-td9hc\" (UID: \"04714062-a561-43b3-a168-905574386a8c\") " pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.208490 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04714062-a561-43b3-a168-905574386a8c-utilities\") pod \"community-operators-td9hc\" (UID: \"04714062-a561-43b3-a168-905574386a8c\") " pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.208725 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5t4r\" (UniqueName: \"kubernetes.io/projected/04714062-a561-43b3-a168-905574386a8c-kube-api-access-c5t4r\") pod \"community-operators-td9hc\" (UID: \"04714062-a561-43b3-a168-905574386a8c\") " pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.310961 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04714062-a561-43b3-a168-905574386a8c-catalog-content\") pod \"community-operators-td9hc\" (UID: \"04714062-a561-43b3-a168-905574386a8c\") " pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.311067 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04714062-a561-43b3-a168-905574386a8c-utilities\") pod \"community-operators-td9hc\" (UID: \"04714062-a561-43b3-a168-905574386a8c\") " pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.311174 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5t4r\" (UniqueName: \"kubernetes.io/projected/04714062-a561-43b3-a168-905574386a8c-kube-api-access-c5t4r\") pod \"community-operators-td9hc\" (UID: \"04714062-a561-43b3-a168-905574386a8c\") " pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.311571 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04714062-a561-43b3-a168-905574386a8c-catalog-content\") pod \"community-operators-td9hc\" (UID: \"04714062-a561-43b3-a168-905574386a8c\") " pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.311657 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04714062-a561-43b3-a168-905574386a8c-utilities\") pod \"community-operators-td9hc\" (UID: 
\"04714062-a561-43b3-a168-905574386a8c\") " pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.331268 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5t4r\" (UniqueName: \"kubernetes.io/projected/04714062-a561-43b3-a168-905574386a8c-kube-api-access-c5t4r\") pod \"community-operators-td9hc\" (UID: \"04714062-a561-43b3-a168-905574386a8c\") " pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.408065 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:56:59 crc kubenswrapper[4585]: I1201 14:56:59.986943 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-td9hc"] Dec 01 14:57:00 crc kubenswrapper[4585]: I1201 14:57:00.947982 4585 generic.go:334] "Generic (PLEG): container finished" podID="04714062-a561-43b3-a168-905574386a8c" containerID="fd934a15de253ee7f6b37cff90a0be1fda3a9f5c80236654f85d50607ca97e18" exitCode=0 Dec 01 14:57:00 crc kubenswrapper[4585]: I1201 14:57:00.948071 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td9hc" event={"ID":"04714062-a561-43b3-a168-905574386a8c","Type":"ContainerDied","Data":"fd934a15de253ee7f6b37cff90a0be1fda3a9f5c80236654f85d50607ca97e18"} Dec 01 14:57:00 crc kubenswrapper[4585]: I1201 14:57:00.948271 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td9hc" event={"ID":"04714062-a561-43b3-a168-905574386a8c","Type":"ContainerStarted","Data":"f0a5112992f399912964d8f65dfda45411dba245fc2692574594dffc937ba96f"} Dec 01 14:57:01 crc kubenswrapper[4585]: I1201 14:57:01.958724 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td9hc" event={"ID":"04714062-a561-43b3-a168-905574386a8c","Type":"ContainerStarted","Data":"d8e55eb3994f20d50b5c61c436f722c74f661909279c79a8789b469be974c2f0"} Dec 01 14:57:02 crc kubenswrapper[4585]: I1201 14:57:02.970475 4585 generic.go:334] "Generic (PLEG): container finished" podID="04714062-a561-43b3-a168-905574386a8c" containerID="d8e55eb3994f20d50b5c61c436f722c74f661909279c79a8789b469be974c2f0" exitCode=0 Dec 01 14:57:02 crc kubenswrapper[4585]: I1201 14:57:02.970589 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td9hc" event={"ID":"04714062-a561-43b3-a168-905574386a8c","Type":"ContainerDied","Data":"d8e55eb3994f20d50b5c61c436f722c74f661909279c79a8789b469be974c2f0"} Dec 01 14:57:03 crc kubenswrapper[4585]: I1201 14:57:03.982498 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td9hc" event={"ID":"04714062-a561-43b3-a168-905574386a8c","Type":"ContainerStarted","Data":"98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a"} Dec 01 14:57:03 crc kubenswrapper[4585]: I1201 14:57:03.999726 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-td9hc" podStartSLOduration=2.385003473 podStartE2EDuration="4.99970795s" podCreationTimestamp="2025-12-01 14:56:59 +0000 UTC" firstStartedPulling="2025-12-01 14:57:00.950421872 +0000 UTC m=+3534.934635727" lastFinishedPulling="2025-12-01 14:57:03.565126349 +0000 UTC m=+3537.549340204" observedRunningTime="2025-12-01 14:57:03.998407726 +0000 UTC m=+3537.982621591" 
watchObservedRunningTime="2025-12-01 14:57:03.99970795 +0000 UTC m=+3537.983921805" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.044123 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/util/0.log" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.327597 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/pull/0.log" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.335177 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/pull/0.log" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.340909 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/util/0.log" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.408797 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.408842 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.454949 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.551125 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/util/0.log" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.618242 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/extract/0.log" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.618322 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_92745d039728c2a50a72aca57f7e5e275bb4aa1a0150662e177c6352fbgxw5c_d29365aa-3c8b-46c7-8b46-eb101a582cc2/pull/0.log" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.766145 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-br8df_8da768c2-cb8c-40f9-b8d1-54a66743b340/kube-rbac-proxy/0.log" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.881311 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qnbzj_59994d2c-6485-4beb-bcfc-3fd4a22bd203/kube-rbac-proxy/0.log" Dec 01 14:57:09 crc kubenswrapper[4585]: I1201 14:57:09.887181 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-br8df_8da768c2-cb8c-40f9-b8d1-54a66743b340/manager/0.log" Dec 01 14:57:10 crc kubenswrapper[4585]: I1201 14:57:10.068486 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qnbzj_59994d2c-6485-4beb-bcfc-3fd4a22bd203/manager/0.log" Dec 01 14:57:10 crc kubenswrapper[4585]: I1201 
14:57:10.085014 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:57:10 crc kubenswrapper[4585]: I1201 14:57:10.138893 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-td9hc"] Dec 01 14:57:10 crc kubenswrapper[4585]: I1201 14:57:10.141865 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-8qd82_c4697227-2800-4a64-89bf-5bf831077ceb/manager/0.log" Dec 01 14:57:10 crc kubenswrapper[4585]: I1201 14:57:10.211762 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-8qd82_c4697227-2800-4a64-89bf-5bf831077ceb/kube-rbac-proxy/0.log" Dec 01 14:57:10 crc kubenswrapper[4585]: I1201 14:57:10.343759 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-4kmsq_7f8c91fb-441e-44f0-bf97-1340df47f4b0/kube-rbac-proxy/0.log" Dec 01 14:57:10 crc kubenswrapper[4585]: I1201 14:57:10.511335 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-4kmsq_7f8c91fb-441e-44f0-bf97-1340df47f4b0/manager/0.log" Dec 01 14:57:10 crc kubenswrapper[4585]: I1201 14:57:10.644716 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-qpqgr_60bbdebb-4ac8-4971-82b2-252a989a8c3a/kube-rbac-proxy/0.log" Dec 01 14:57:10 crc kubenswrapper[4585]: I1201 14:57:10.645473 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-qpqgr_60bbdebb-4ac8-4971-82b2-252a989a8c3a/manager/0.log" Dec 01 14:57:10 crc kubenswrapper[4585]: I1201 14:57:10.777791 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-vxz95_5afd79b9-5528-4ffe-9d3f-ac7b05502348/kube-rbac-proxy/0.log" Dec 01 14:57:10 crc kubenswrapper[4585]: I1201 14:57:10.891091 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-vxz95_5afd79b9-5528-4ffe-9d3f-ac7b05502348/manager/0.log" Dec 01 14:57:10 crc kubenswrapper[4585]: I1201 14:57:10.974496 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-77nsb_f496d7d1-7362-487d-88d7-33e2c26ce97b/kube-rbac-proxy/0.log" Dec 01 14:57:11 crc kubenswrapper[4585]: I1201 14:57:11.123202 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-sqs7f_1d516bcd-4ed7-4c83-a07e-3a8f66761090/kube-rbac-proxy/0.log" Dec 01 14:57:11 crc kubenswrapper[4585]: I1201 14:57:11.169919 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-77nsb_f496d7d1-7362-487d-88d7-33e2c26ce97b/manager/0.log" Dec 01 14:57:11 crc kubenswrapper[4585]: I1201 14:57:11.247164 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-sqs7f_1d516bcd-4ed7-4c83-a07e-3a8f66761090/manager/0.log" Dec 01 14:57:11 crc kubenswrapper[4585]: I1201 14:57:11.366169 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-xsbwl_50b75abe-8fa5-4e48-87bb-560b5609feda/kube-rbac-proxy/0.log" Dec 01 14:57:11 crc kubenswrapper[4585]: I1201 14:57:11.490732 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-xsbwl_50b75abe-8fa5-4e48-87bb-560b5609feda/manager/0.log" Dec 01 14:57:11 crc kubenswrapper[4585]: I1201 14:57:11.558615 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-pmpnl_abe5e9b4-4f45-4fb6-92f7-739d4174996b/kube-rbac-proxy/0.log" Dec 01 14:57:11 crc kubenswrapper[4585]: I1201 14:57:11.656108 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-pmpnl_abe5e9b4-4f45-4fb6-92f7-739d4174996b/manager/0.log" Dec 01 14:57:11 crc kubenswrapper[4585]: I1201 14:57:11.709285 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ndffl_1c21caba-6277-4106-b637-a4874412f527/kube-rbac-proxy/0.log" Dec 01 14:57:11 crc kubenswrapper[4585]: I1201 14:57:11.849869 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-ndffl_1c21caba-6277-4106-b637-a4874412f527/manager/0.log" Dec 01 14:57:11 crc kubenswrapper[4585]: I1201 14:57:11.969618 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-8fdcb_f62dd90c-aa85-4650-92e0-13e52ec60360/kube-rbac-proxy/0.log" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.030011 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-8fdcb_f62dd90c-aa85-4650-92e0-13e52ec60360/manager/0.log" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.043334 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-td9hc" podUID="04714062-a561-43b3-a168-905574386a8c" containerName="registry-server" containerID="cri-o://98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a" gracePeriod=2 Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.227856 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-w4g57_baa99a85-be34-458d-bc16-c367d4635b10/kube-rbac-proxy/0.log" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.275966 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-w4g57_baa99a85-be34-458d-bc16-c367d4635b10/manager/0.log" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.548421 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-6hdc6_0854e7b6-a6fb-4fd0-9e48-564df5d8fea2/kube-rbac-proxy/0.log" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.622824 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.708055 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-6hdc6_0854e7b6-a6fb-4fd0-9e48-564df5d8fea2/manager/0.log" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.719109 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04714062-a561-43b3-a168-905574386a8c-catalog-content\") pod \"04714062-a561-43b3-a168-905574386a8c\" (UID: \"04714062-a561-43b3-a168-905574386a8c\") " Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.719333 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5t4r\" (UniqueName: \"kubernetes.io/projected/04714062-a561-43b3-a168-905574386a8c-kube-api-access-c5t4r\") pod \"04714062-a561-43b3-a168-905574386a8c\" (UID: \"04714062-a561-43b3-a168-905574386a8c\") " Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.719381 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04714062-a561-43b3-a168-905574386a8c-utilities\") pod \"04714062-a561-43b3-a168-905574386a8c\" (UID: \"04714062-a561-43b3-a168-905574386a8c\") " Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.720074 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04714062-a561-43b3-a168-905574386a8c-utilities" (OuterVolumeSpecName: "utilities") pod "04714062-a561-43b3-a168-905574386a8c" (UID: "04714062-a561-43b3-a168-905574386a8c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.722997 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh_bcc7d39e-d462-4eaa-89fa-625c72c956b6/kube-rbac-proxy/0.log" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.725222 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04714062-a561-43b3-a168-905574386a8c-kube-api-access-c5t4r" (OuterVolumeSpecName: "kube-api-access-c5t4r") pod "04714062-a561-43b3-a168-905574386a8c" (UID: "04714062-a561-43b3-a168-905574386a8c"). InnerVolumeSpecName "kube-api-access-c5t4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.767914 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04714062-a561-43b3-a168-905574386a8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04714062-a561-43b3-a168-905574386a8c" (UID: "04714062-a561-43b3-a168-905574386a8c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.822055 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5t4r\" (UniqueName: \"kubernetes.io/projected/04714062-a561-43b3-a168-905574386a8c-kube-api-access-c5t4r\") on node \"crc\" DevicePath \"\"" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.822089 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04714062-a561-43b3-a168-905574386a8c-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.822099 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04714062-a561-43b3-a168-905574386a8c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 14:57:12 crc kubenswrapper[4585]: I1201 14:57:12.862379 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4fr7bh_bcc7d39e-d462-4eaa-89fa-625c72c956b6/manager/0.log" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.058031 4585 generic.go:334] "Generic (PLEG): container finished" podID="04714062-a561-43b3-a168-905574386a8c" containerID="98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a" exitCode=0 Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.058085 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td9hc" event={"ID":"04714062-a561-43b3-a168-905574386a8c","Type":"ContainerDied","Data":"98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a"} Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.058117 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td9hc" event={"ID":"04714062-a561-43b3-a168-905574386a8c","Type":"ContainerDied","Data":"f0a5112992f399912964d8f65dfda45411dba245fc2692574594dffc937ba96f"} Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.058137 4585 scope.go:117] "RemoveContainer" containerID="98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.058379 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-td9hc" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.083103 4585 scope.go:117] "RemoveContainer" containerID="d8e55eb3994f20d50b5c61c436f722c74f661909279c79a8789b469be974c2f0" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.127881 4585 scope.go:117] "RemoveContainer" containerID="fd934a15de253ee7f6b37cff90a0be1fda3a9f5c80236654f85d50607ca97e18" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.128172 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-td9hc"] Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.161591 4585 scope.go:117] "RemoveContainer" containerID="98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a" Dec 01 14:57:13 crc kubenswrapper[4585]: E1201 14:57:13.163527 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a\": container with ID starting with 98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a not found: ID does not exist" containerID="98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.163567 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a"} err="failed to get container status \"98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a\": rpc error: code = NotFound desc = could not find container \"98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a\": container with ID starting with 98d0f0af0168b274f1612ed71afe5ec322f93bf224916dd872cef4a9f3a3134a not found: ID does not exist" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.163589 4585 scope.go:117] "RemoveContainer" containerID="d8e55eb3994f20d50b5c61c436f722c74f661909279c79a8789b469be974c2f0" Dec 01 14:57:13 crc kubenswrapper[4585]: E1201 14:57:13.170817 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e55eb3994f20d50b5c61c436f722c74f661909279c79a8789b469be974c2f0\": container with ID starting with d8e55eb3994f20d50b5c61c436f722c74f661909279c79a8789b469be974c2f0 not found: ID does not exist" containerID="d8e55eb3994f20d50b5c61c436f722c74f661909279c79a8789b469be974c2f0" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.170859 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e55eb3994f20d50b5c61c436f722c74f661909279c79a8789b469be974c2f0"} err="failed to get container status \"d8e55eb3994f20d50b5c61c436f722c74f661909279c79a8789b469be974c2f0\": rpc error: code = NotFound desc = could not find container \"d8e55eb3994f20d50b5c61c436f722c74f661909279c79a8789b469be974c2f0\": container with ID starting with d8e55eb3994f20d50b5c61c436f722c74f661909279c79a8789b469be974c2f0 not found: ID does not exist" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.170899 4585 scope.go:117] "RemoveContainer" containerID="fd934a15de253ee7f6b37cff90a0be1fda3a9f5c80236654f85d50607ca97e18" Dec 01 14:57:13 crc kubenswrapper[4585]: E1201 14:57:13.171265 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd934a15de253ee7f6b37cff90a0be1fda3a9f5c80236654f85d50607ca97e18\": container with ID starting with 
fd934a15de253ee7f6b37cff90a0be1fda3a9f5c80236654f85d50607ca97e18 not found: ID does not exist" containerID="fd934a15de253ee7f6b37cff90a0be1fda3a9f5c80236654f85d50607ca97e18" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.171287 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd934a15de253ee7f6b37cff90a0be1fda3a9f5c80236654f85d50607ca97e18"} err="failed to get container status \"fd934a15de253ee7f6b37cff90a0be1fda3a9f5c80236654f85d50607ca97e18\": rpc error: code = NotFound desc = could not find container \"fd934a15de253ee7f6b37cff90a0be1fda3a9f5c80236654f85d50607ca97e18\": container with ID starting with fd934a15de253ee7f6b37cff90a0be1fda3a9f5c80236654f85d50607ca97e18 not found: ID does not exist" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.171755 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-td9hc"] Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.359020 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-47c9p_12b19089-35d3-41e8-b50f-385c3d8bb27a/registry-server/0.log" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.396174 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-d645d669b-rhjvp_9c187dfe-402b-4e73-8f4f-3d9dcf360954/operator/0.log" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.715664 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.715995 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.720563 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-r4qhj_7b6381d5-3b01-4c14-a553-e4a51274b140/kube-rbac-proxy/0.log" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.927544 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-r4qhj_7b6381d5-3b01-4c14-a553-e4a51274b140/manager/0.log" Dec 01 14:57:13 crc kubenswrapper[4585]: I1201 14:57:13.941478 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-b9b8558c-w5sxw_2177fed7-edae-4e55-94fd-2037166cbfdc/manager/0.log" Dec 01 14:57:14 crc kubenswrapper[4585]: I1201 14:57:14.049657 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-m6vwb_cdbd6707-63ae-429d-8111-48ab6f912699/kube-rbac-proxy/0.log" Dec 01 14:57:14 crc kubenswrapper[4585]: I1201 14:57:14.069321 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-m6vwb_cdbd6707-63ae-429d-8111-48ab6f912699/manager/0.log" Dec 01 14:57:14 crc kubenswrapper[4585]: I1201 14:57:14.138853 4585 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-dcz6q_f4100ac0-da14-4d72-88e8-7f7356dad361/operator/0.log" Dec 01 14:57:14 crc kubenswrapper[4585]: I1201 14:57:14.296316 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5448bbd495-75vsz_b847594a-d018-4939-8177-3faf4a42da5a/manager/0.log" Dec 01 14:57:14 crc kubenswrapper[4585]: I1201 14:57:14.323322 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5448bbd495-75vsz_b847594a-d018-4939-8177-3faf4a42da5a/kube-rbac-proxy/0.log" Dec 01 14:57:14 crc kubenswrapper[4585]: I1201 14:57:14.425792 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04714062-a561-43b3-a168-905574386a8c" path="/var/lib/kubelet/pods/04714062-a561-43b3-a168-905574386a8c/volumes" Dec 01 14:57:14 crc kubenswrapper[4585]: I1201 14:57:14.456701 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-tgt7n_a672a71f-0885-4771-811e-fd658d282a84/kube-rbac-proxy/0.log" Dec 01 14:57:14 crc kubenswrapper[4585]: I1201 14:57:14.525426 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-tgt7n_a672a71f-0885-4771-811e-fd658d282a84/manager/0.log" Dec 01 14:57:14 crc kubenswrapper[4585]: I1201 14:57:14.581322 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-798dw_c756d201-c2d0-45f1-af3a-acdff1926a1a/kube-rbac-proxy/0.log" Dec 01 14:57:14 crc kubenswrapper[4585]: I1201 14:57:14.643026 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-798dw_c756d201-c2d0-45f1-af3a-acdff1926a1a/manager/0.log" Dec 01 14:57:14 crc kubenswrapper[4585]: I1201 14:57:14.757327 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-cqpws_59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4/kube-rbac-proxy/0.log" Dec 01 14:57:14 crc kubenswrapper[4585]: I1201 14:57:14.784994 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-cqpws_59cf60ed-a0d2-4ddc-bfc5-d5973ecbb9e4/manager/0.log" Dec 01 14:57:35 crc kubenswrapper[4585]: I1201 14:57:35.624462 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rn9hl_dc83aa4a-2686-47c8-876b-c6cf2192b493/control-plane-machine-set-operator/0.log" Dec 01 14:57:35 crc kubenswrapper[4585]: I1201 14:57:35.743861 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-42pj4_795dab1c-49d5-4b05-a84f-4e1655d459fc/kube-rbac-proxy/0.log" Dec 01 14:57:35 crc kubenswrapper[4585]: I1201 14:57:35.794042 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-42pj4_795dab1c-49d5-4b05-a84f-4e1655d459fc/machine-api-operator/0.log" Dec 01 14:57:43 crc kubenswrapper[4585]: I1201 14:57:43.716135 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 01 14:57:43 crc kubenswrapper[4585]: I1201 14:57:43.717659 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:57:48 crc kubenswrapper[4585]: I1201 14:57:48.401442 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-mk428_45ed0e5e-d1d0-45c5-9710-bcc051a7956e/cert-manager-controller/0.log" Dec 01 14:57:48 crc kubenswrapper[4585]: I1201 14:57:48.563190 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-kfgj7_03ae09b7-07fe-4a7b-9012-c17019e6d0fa/cert-manager-cainjector/0.log" Dec 01 14:57:48 crc kubenswrapper[4585]: I1201 14:57:48.683638 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-2ldxr_f1560ff4-292f-425d-8b6f-d481c951c541/cert-manager-webhook/0.log" Dec 01 14:58:01 crc kubenswrapper[4585]: I1201 14:58:01.373342 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-kmtzd_5de4a007-93f4-45e6-a70a-5a036ff4377c/nmstate-console-plugin/0.log" Dec 01 14:58:01 crc kubenswrapper[4585]: I1201 14:58:01.525247 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jj8n7_6c8cbbf5-3146-44bc-8533-17523cd27750/nmstate-handler/0.log" Dec 01 14:58:01 crc kubenswrapper[4585]: I1201 14:58:01.555670 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-w84ck_d03ec6db-a14f-40ee-80b7-2232ffc0a321/kube-rbac-proxy/0.log" Dec 01 14:58:01 crc kubenswrapper[4585]: I1201 14:58:01.599603 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-w84ck_d03ec6db-a14f-40ee-80b7-2232ffc0a321/nmstate-metrics/0.log" Dec 01 14:58:01 crc kubenswrapper[4585]: I1201 14:58:01.809442 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-c4cpl_dfe8ef28-9d20-49ce-8084-bfdfbc024e0c/nmstate-operator/0.log" Dec 01 14:58:01 crc kubenswrapper[4585]: I1201 14:58:01.822157 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-rvcwb_a4a15dc7-9cbc-4c39-b9ec-f73877001cd7/nmstate-webhook/0.log" Dec 01 14:58:13 crc kubenswrapper[4585]: I1201 14:58:13.716232 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 14:58:13 crc kubenswrapper[4585]: I1201 14:58:13.716694 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 14:58:13 crc kubenswrapper[4585]: I1201 14:58:13.716736 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 14:58:13 crc 
kubenswrapper[4585]: I1201 14:58:13.717494 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e0b4194459fe2d90ee7c18b376805cee001984de119c4ef64ab23cb42caa200"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 14:58:13 crc kubenswrapper[4585]: I1201 14:58:13.717546 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://1e0b4194459fe2d90ee7c18b376805cee001984de119c4ef64ab23cb42caa200" gracePeriod=600 Dec 01 14:58:14 crc kubenswrapper[4585]: I1201 14:58:14.575513 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="1e0b4194459fe2d90ee7c18b376805cee001984de119c4ef64ab23cb42caa200" exitCode=0 Dec 01 14:58:14 crc kubenswrapper[4585]: I1201 14:58:14.575555 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"1e0b4194459fe2d90ee7c18b376805cee001984de119c4ef64ab23cb42caa200"} Dec 01 14:58:14 crc kubenswrapper[4585]: I1201 14:58:14.576171 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerStarted","Data":"696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b"} Dec 01 14:58:14 crc kubenswrapper[4585]: I1201 14:58:14.576192 4585 scope.go:117] "RemoveContainer" containerID="7d09788976c97c2a3c377b62292f62228c7f604af4c086c3c984900409d574e6" Dec 01 14:58:18 crc kubenswrapper[4585]: I1201 14:58:18.710991 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-hx8mm_3028ccae-b87c-4752-9558-1399dc8fa279/kube-rbac-proxy/0.log" Dec 01 14:58:18 crc kubenswrapper[4585]: I1201 14:58:18.791981 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-hx8mm_3028ccae-b87c-4752-9558-1399dc8fa279/controller/0.log" Dec 01 14:58:18 crc kubenswrapper[4585]: I1201 14:58:18.968527 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-frr-files/0.log" Dec 01 14:58:19 crc kubenswrapper[4585]: I1201 14:58:19.202286 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-frr-files/0.log" Dec 01 14:58:19 crc kubenswrapper[4585]: I1201 14:58:19.207542 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-reloader/0.log" Dec 01 14:58:19 crc kubenswrapper[4585]: I1201 14:58:19.320751 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-reloader/0.log" Dec 01 14:58:19 crc kubenswrapper[4585]: I1201 14:58:19.329480 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-metrics/0.log" Dec 01 14:58:19 crc kubenswrapper[4585]: I1201 14:58:19.491748 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-reloader/0.log" Dec 01 14:58:19 crc kubenswrapper[4585]: I1201 14:58:19.524092 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-frr-files/0.log" Dec 01 14:58:19 crc kubenswrapper[4585]: I1201 14:58:19.540370 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-metrics/0.log" Dec 01 14:58:19 crc kubenswrapper[4585]: I1201 14:58:19.594536 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-metrics/0.log" Dec 01 14:58:20 crc kubenswrapper[4585]: I1201 14:58:20.080175 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-frr-files/0.log" Dec 01 14:58:20 crc kubenswrapper[4585]: I1201 14:58:20.117332 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-metrics/0.log" Dec 01 14:58:20 crc kubenswrapper[4585]: I1201 14:58:20.166135 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/cp-reloader/0.log" Dec 01 14:58:20 crc kubenswrapper[4585]: I1201 14:58:20.218244 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/controller/0.log" Dec 01 14:58:20 crc kubenswrapper[4585]: I1201 14:58:20.411619 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/frr-metrics/0.log" Dec 01 14:58:20 crc kubenswrapper[4585]: I1201 14:58:20.485494 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/kube-rbac-proxy/0.log" Dec 01 14:58:20 crc kubenswrapper[4585]: I1201 14:58:20.541700 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/kube-rbac-proxy-frr/0.log" Dec 01 14:58:20 crc kubenswrapper[4585]: I1201 14:58:20.688205 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/reloader/0.log" Dec 01 14:58:20 crc kubenswrapper[4585]: I1201 14:58:20.838544 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-lvvzw_7ee13572-ff22-43ea-8570-cc0f3a64d44e/frr-k8s-webhook-server/0.log" Dec 01 14:58:21 crc kubenswrapper[4585]: I1201 14:58:21.079633 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-65958ffb48-2555t_661d1aa1-ad66-45b2-8562-69776e5fb5af/manager/0.log" Dec 01 14:58:21 crc kubenswrapper[4585]: I1201 14:58:21.224560 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-64878d448f-cvc5q_99217243-f8e1-4533-925c-a3fac9b81346/webhook-server/0.log" Dec 01 14:58:21 crc kubenswrapper[4585]: I1201 14:58:21.373655 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-nklgw_acd2a938-907c-443a-bd52-c0dfd4fbd455/frr/0.log" Dec 01 14:58:21 crc kubenswrapper[4585]: I1201 14:58:21.597845 4585 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-tnnzj_24c44b85-a153-4622-864f-a0f690044361/kube-rbac-proxy/0.log" Dec 01 14:58:21 crc kubenswrapper[4585]: I1201 14:58:21.725217 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tnnzj_24c44b85-a153-4622-864f-a0f690044361/speaker/0.log" Dec 01 14:58:36 crc kubenswrapper[4585]: I1201 14:58:36.940584 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/util/0.log" Dec 01 14:58:37 crc kubenswrapper[4585]: I1201 14:58:37.189851 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/pull/0.log" Dec 01 14:58:37 crc kubenswrapper[4585]: I1201 14:58:37.200353 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/util/0.log" Dec 01 14:58:37 crc kubenswrapper[4585]: I1201 14:58:37.204576 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/pull/0.log" Dec 01 14:58:37 crc kubenswrapper[4585]: I1201 14:58:37.416682 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/pull/0.log" Dec 01 14:58:37 crc kubenswrapper[4585]: I1201 14:58:37.439743 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/util/0.log" Dec 01 14:58:37 crc kubenswrapper[4585]: I1201 14:58:37.443564 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ff8sb7_37acd505-ab0f-4779-844d-3dbe65a936c0/extract/0.log" Dec 01 14:58:37 crc kubenswrapper[4585]: I1201 14:58:37.702792 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/util/0.log" Dec 01 14:58:37 crc kubenswrapper[4585]: I1201 14:58:37.895938 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/pull/0.log" Dec 01 14:58:37 crc kubenswrapper[4585]: I1201 14:58:37.937673 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/util/0.log" Dec 01 14:58:37 crc kubenswrapper[4585]: I1201 14:58:37.941854 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/pull/0.log" Dec 01 14:58:38 crc kubenswrapper[4585]: I1201 14:58:38.142416 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/util/0.log" Dec 01 14:58:38 crc kubenswrapper[4585]: I1201 14:58:38.165143 4585 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/pull/0.log" Dec 01 14:58:38 crc kubenswrapper[4585]: I1201 14:58:38.269302 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83bnmj8_2fb40ba9-c26f-4fe8-900d-c5bd775febf6/extract/0.log" Dec 01 14:58:38 crc kubenswrapper[4585]: I1201 14:58:38.388350 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/extract-utilities/0.log" Dec 01 14:58:38 crc kubenswrapper[4585]: I1201 14:58:38.908051 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/extract-content/0.log" Dec 01 14:58:38 crc kubenswrapper[4585]: I1201 14:58:38.935184 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/extract-utilities/0.log" Dec 01 14:58:38 crc kubenswrapper[4585]: I1201 14:58:38.966026 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/extract-content/0.log" Dec 01 14:58:39 crc kubenswrapper[4585]: I1201 14:58:39.026884 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/extract-utilities/0.log" Dec 01 14:58:39 crc kubenswrapper[4585]: I1201 14:58:39.093526 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/extract-content/0.log" Dec 01 14:58:39 crc kubenswrapper[4585]: I1201 14:58:39.328840 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/extract-utilities/0.log" Dec 01 14:58:39 crc kubenswrapper[4585]: I1201 14:58:39.552356 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mzg67_d92e1230-1b33-449a-9d96-204cdc4cc3ee/registry-server/0.log" Dec 01 14:58:39 crc kubenswrapper[4585]: I1201 14:58:39.637072 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/extract-utilities/0.log" Dec 01 14:58:39 crc kubenswrapper[4585]: I1201 14:58:39.694305 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/extract-content/0.log" Dec 01 14:58:39 crc kubenswrapper[4585]: I1201 14:58:39.716516 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/extract-content/0.log" Dec 01 14:58:39 crc kubenswrapper[4585]: I1201 14:58:39.999516 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/extract-utilities/0.log" Dec 01 14:58:40 crc kubenswrapper[4585]: I1201 14:58:40.101821 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/extract-content/0.log" Dec 01 14:58:40 crc kubenswrapper[4585]: I1201 
14:58:40.392786 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/extract-utilities/0.log" Dec 01 14:58:40 crc kubenswrapper[4585]: I1201 14:58:40.439452 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lwz9j_73887a19-b0ad-43de-a7d3-bda4a7a2a06a/marketplace-operator/0.log" Dec 01 14:58:40 crc kubenswrapper[4585]: I1201 14:58:40.443439 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-74dzn_cdaf429b-b6c0-4f60-a032-17262f7466f4/registry-server/0.log" Dec 01 14:58:40 crc kubenswrapper[4585]: I1201 14:58:40.691690 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/extract-utilities/0.log" Dec 01 14:58:40 crc kubenswrapper[4585]: I1201 14:58:40.697203 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/extract-content/0.log" Dec 01 14:58:40 crc kubenswrapper[4585]: I1201 14:58:40.704026 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/extract-content/0.log" Dec 01 14:58:40 crc kubenswrapper[4585]: I1201 14:58:40.973262 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/extract-utilities/0.log" Dec 01 14:58:41 crc kubenswrapper[4585]: I1201 14:58:41.018789 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/extract-content/0.log" Dec 01 14:58:41 crc kubenswrapper[4585]: I1201 14:58:41.166465 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnkff_f7e0be82-9218-49a5-a141-605615d845a8/registry-server/0.log" Dec 01 14:58:41 crc kubenswrapper[4585]: I1201 14:58:41.172743 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/extract-utilities/0.log" Dec 01 14:58:41 crc kubenswrapper[4585]: I1201 14:58:41.361699 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/extract-content/0.log" Dec 01 14:58:41 crc kubenswrapper[4585]: I1201 14:58:41.377828 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/extract-content/0.log" Dec 01 14:58:41 crc kubenswrapper[4585]: I1201 14:58:41.379886 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/extract-utilities/0.log" Dec 01 14:58:41 crc kubenswrapper[4585]: I1201 14:58:41.525100 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/extract-content/0.log" Dec 01 14:58:41 crc kubenswrapper[4585]: I1201 14:58:41.553096 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/extract-utilities/0.log" Dec 01 14:58:41 crc kubenswrapper[4585]: I1201 14:58:41.939109 4585 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4mtgw_d7ac0d5a-4c26-4734-9386-f775f8dc5461/registry-server/0.log" Dec 01 14:59:03 crc kubenswrapper[4585]: E1201 14:59:03.846984 4585 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.44:48756->38.102.83.44:34393: write tcp 38.102.83.44:48756->38.102.83.44:34393: write: broken pipe Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.170757 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2"] Dec 01 15:00:00 crc kubenswrapper[4585]: E1201 15:00:00.173489 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04714062-a561-43b3-a168-905574386a8c" containerName="extract-utilities" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.173574 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="04714062-a561-43b3-a168-905574386a8c" containerName="extract-utilities" Dec 01 15:00:00 crc kubenswrapper[4585]: E1201 15:00:00.173633 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04714062-a561-43b3-a168-905574386a8c" containerName="registry-server" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.173684 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="04714062-a561-43b3-a168-905574386a8c" containerName="registry-server" Dec 01 15:00:00 crc kubenswrapper[4585]: E1201 15:00:00.173755 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04714062-a561-43b3-a168-905574386a8c" containerName="extract-content" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.173822 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="04714062-a561-43b3-a168-905574386a8c" containerName="extract-content" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.174111 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="04714062-a561-43b3-a168-905574386a8c" containerName="registry-server" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.174803 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.177711 4585 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.180532 4585 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.186577 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2"] Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.205028 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-config-volume\") pod \"collect-profiles-29410020-h98t2\" (UID: \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.205217 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skrbj\" (UniqueName: \"kubernetes.io/projected/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-kube-api-access-skrbj\") pod \"collect-profiles-29410020-h98t2\" (UID: \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.205302 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-secret-volume\") pod \"collect-profiles-29410020-h98t2\" (UID: \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.306915 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skrbj\" (UniqueName: \"kubernetes.io/projected/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-kube-api-access-skrbj\") pod \"collect-profiles-29410020-h98t2\" (UID: \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.307280 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-secret-volume\") pod \"collect-profiles-29410020-h98t2\" (UID: \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.307396 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-config-volume\") pod \"collect-profiles-29410020-h98t2\" (UID: \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.308368 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-config-volume\") pod 
\"collect-profiles-29410020-h98t2\" (UID: \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.323286 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-secret-volume\") pod \"collect-profiles-29410020-h98t2\" (UID: \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.330721 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skrbj\" (UniqueName: \"kubernetes.io/projected/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-kube-api-access-skrbj\") pod \"collect-profiles-29410020-h98t2\" (UID: \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:00 crc kubenswrapper[4585]: I1201 15:00:00.510835 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:01 crc kubenswrapper[4585]: I1201 15:00:01.006513 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2"] Dec 01 15:00:01 crc kubenswrapper[4585]: I1201 15:00:01.618399 4585 generic.go:334] "Generic (PLEG): container finished" podID="fbcbc065-131e-4cf6-9cfe-c79b0f729b49" containerID="3154a553f2c6e7088045905855f346c1bb94ad9189de4e1f287e7b6319a2d449" exitCode=0 Dec 01 15:00:01 crc kubenswrapper[4585]: I1201 15:00:01.618582 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" event={"ID":"fbcbc065-131e-4cf6-9cfe-c79b0f729b49","Type":"ContainerDied","Data":"3154a553f2c6e7088045905855f346c1bb94ad9189de4e1f287e7b6319a2d449"} Dec 01 15:00:01 crc kubenswrapper[4585]: I1201 15:00:01.618742 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" event={"ID":"fbcbc065-131e-4cf6-9cfe-c79b0f729b49","Type":"ContainerStarted","Data":"813fe000694d0b6feec5a79bd33b740f34ec8c82da04b792e5e7ce01bc9aaf5f"} Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.030324 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.072561 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-secret-volume\") pod \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\" (UID: \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\") " Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.073009 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-config-volume\") pod \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\" (UID: \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\") " Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.073174 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skrbj\" (UniqueName: \"kubernetes.io/projected/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-kube-api-access-skrbj\") pod \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\" (UID: \"fbcbc065-131e-4cf6-9cfe-c79b0f729b49\") " Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.075354 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-config-volume" (OuterVolumeSpecName: "config-volume") pod "fbcbc065-131e-4cf6-9cfe-c79b0f729b49" (UID: "fbcbc065-131e-4cf6-9cfe-c79b0f729b49"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.086607 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-kube-api-access-skrbj" (OuterVolumeSpecName: "kube-api-access-skrbj") pod "fbcbc065-131e-4cf6-9cfe-c79b0f729b49" (UID: "fbcbc065-131e-4cf6-9cfe-c79b0f729b49"). InnerVolumeSpecName "kube-api-access-skrbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.088988 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fbcbc065-131e-4cf6-9cfe-c79b0f729b49" (UID: "fbcbc065-131e-4cf6-9cfe-c79b0f729b49"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.175198 4585 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.175230 4585 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.175241 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skrbj\" (UniqueName: \"kubernetes.io/projected/fbcbc065-131e-4cf6-9cfe-c79b0f729b49-kube-api-access-skrbj\") on node \"crc\" DevicePath \"\"" Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.635291 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" event={"ID":"fbcbc065-131e-4cf6-9cfe-c79b0f729b49","Type":"ContainerDied","Data":"813fe000694d0b6feec5a79bd33b740f34ec8c82da04b792e5e7ce01bc9aaf5f"} Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.635326 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="813fe000694d0b6feec5a79bd33b740f34ec8c82da04b792e5e7ce01bc9aaf5f" Dec 01 15:00:03 crc kubenswrapper[4585]: I1201 15:00:03.635423 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29410020-h98t2" Dec 01 15:00:04 crc kubenswrapper[4585]: I1201 15:00:04.130104 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5"] Dec 01 15:00:04 crc kubenswrapper[4585]: I1201 15:00:04.139762 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409975-6pxl5"] Dec 01 15:00:04 crc kubenswrapper[4585]: I1201 15:00:04.423875 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab17cce-6a4f-4ca4-9e77-f1451868e1d3" path="/var/lib/kubelet/pods/dab17cce-6a4f-4ca4-9e77-f1451868e1d3/volumes" Dec 01 15:00:32 crc kubenswrapper[4585]: I1201 15:00:32.903733 4585 generic.go:334] "Generic (PLEG): container finished" podID="bcd106c4-731c-46aa-b8a5-bf4fc4d8c464" containerID="ca9d3ea2ced99fc7805b72e4f5ed5eb136047f061564c97ac64f455459b89613" exitCode=0 Dec 01 15:00:32 crc kubenswrapper[4585]: I1201 15:00:32.904346 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xg2mx/must-gather-qvcpl" event={"ID":"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464","Type":"ContainerDied","Data":"ca9d3ea2ced99fc7805b72e4f5ed5eb136047f061564c97ac64f455459b89613"} Dec 01 15:00:32 crc kubenswrapper[4585]: I1201 15:00:32.905100 4585 scope.go:117] "RemoveContainer" containerID="ca9d3ea2ced99fc7805b72e4f5ed5eb136047f061564c97ac64f455459b89613" Dec 01 15:00:33 crc kubenswrapper[4585]: I1201 15:00:33.567802 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xg2mx_must-gather-qvcpl_bcd106c4-731c-46aa-b8a5-bf4fc4d8c464/gather/0.log" Dec 01 15:00:35 crc kubenswrapper[4585]: I1201 15:00:35.859033 4585 scope.go:117] "RemoveContainer" containerID="5df85fd39531ca2cf08712fe18c70d2665ca45b8156a399546334e4f9deaee47" Dec 01 15:00:43 crc kubenswrapper[4585]: I1201 15:00:43.716610 4585 patch_prober.go:28] interesting 
pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:00:43 crc kubenswrapper[4585]: I1201 15:00:43.717143 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:00:44 crc kubenswrapper[4585]: I1201 15:00:44.711044 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xg2mx/must-gather-qvcpl"] Dec 01 15:00:44 crc kubenswrapper[4585]: I1201 15:00:44.711299 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-xg2mx/must-gather-qvcpl" podUID="bcd106c4-731c-46aa-b8a5-bf4fc4d8c464" containerName="copy" containerID="cri-o://210950b1e81cfb369091e5b37f2a545c370443254c55b9e51a2704bbad43d73e" gracePeriod=2 Dec 01 15:00:44 crc kubenswrapper[4585]: I1201 15:00:44.726555 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xg2mx/must-gather-qvcpl"] Dec 01 15:00:45 crc kubenswrapper[4585]: I1201 15:00:45.028624 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xg2mx_must-gather-qvcpl_bcd106c4-731c-46aa-b8a5-bf4fc4d8c464/copy/0.log" Dec 01 15:00:45 crc kubenswrapper[4585]: I1201 15:00:45.029228 4585 generic.go:334] "Generic (PLEG): container finished" podID="bcd106c4-731c-46aa-b8a5-bf4fc4d8c464" containerID="210950b1e81cfb369091e5b37f2a545c370443254c55b9e51a2704bbad43d73e" exitCode=143 Dec 01 15:00:45 crc kubenswrapper[4585]: I1201 15:00:45.123862 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xg2mx_must-gather-qvcpl_bcd106c4-731c-46aa-b8a5-bf4fc4d8c464/copy/0.log" Dec 01 15:00:45 crc kubenswrapper[4585]: I1201 15:00:45.124488 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xg2mx/must-gather-qvcpl" Dec 01 15:00:45 crc kubenswrapper[4585]: I1201 15:00:45.195309 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464-must-gather-output\") pod \"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464\" (UID: \"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464\") " Dec 01 15:00:45 crc kubenswrapper[4585]: I1201 15:00:45.195463 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8nqq\" (UniqueName: \"kubernetes.io/projected/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464-kube-api-access-z8nqq\") pod \"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464\" (UID: \"bcd106c4-731c-46aa-b8a5-bf4fc4d8c464\") " Dec 01 15:00:45 crc kubenswrapper[4585]: I1201 15:00:45.201950 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464-kube-api-access-z8nqq" (OuterVolumeSpecName: "kube-api-access-z8nqq") pod "bcd106c4-731c-46aa-b8a5-bf4fc4d8c464" (UID: "bcd106c4-731c-46aa-b8a5-bf4fc4d8c464"). InnerVolumeSpecName "kube-api-access-z8nqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:00:45 crc kubenswrapper[4585]: I1201 15:00:45.298157 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8nqq\" (UniqueName: \"kubernetes.io/projected/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464-kube-api-access-z8nqq\") on node \"crc\" DevicePath \"\"" Dec 01 15:00:45 crc kubenswrapper[4585]: I1201 15:00:45.352166 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "bcd106c4-731c-46aa-b8a5-bf4fc4d8c464" (UID: "bcd106c4-731c-46aa-b8a5-bf4fc4d8c464"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:00:45 crc kubenswrapper[4585]: I1201 15:00:45.400105 4585 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 15:00:46 crc kubenswrapper[4585]: I1201 15:00:46.052038 4585 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xg2mx_must-gather-qvcpl_bcd106c4-731c-46aa-b8a5-bf4fc4d8c464/copy/0.log" Dec 01 15:00:46 crc kubenswrapper[4585]: I1201 15:00:46.054249 4585 scope.go:117] "RemoveContainer" containerID="210950b1e81cfb369091e5b37f2a545c370443254c55b9e51a2704bbad43d73e" Dec 01 15:00:46 crc kubenswrapper[4585]: I1201 15:00:46.054392 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xg2mx/must-gather-qvcpl" Dec 01 15:00:46 crc kubenswrapper[4585]: I1201 15:00:46.087238 4585 scope.go:117] "RemoveContainer" containerID="ca9d3ea2ced99fc7805b72e4f5ed5eb136047f061564c97ac64f455459b89613" Dec 01 15:00:46 crc kubenswrapper[4585]: I1201 15:00:46.425069 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcd106c4-731c-46aa-b8a5-bf4fc4d8c464" path="/var/lib/kubelet/pods/bcd106c4-731c-46aa-b8a5-bf4fc4d8c464/volumes" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.150020 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29410021-xkmd9"] Dec 01 15:01:00 crc kubenswrapper[4585]: E1201 15:01:00.150964 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd106c4-731c-46aa-b8a5-bf4fc4d8c464" containerName="gather" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.151037 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd106c4-731c-46aa-b8a5-bf4fc4d8c464" containerName="gather" Dec 01 15:01:00 crc kubenswrapper[4585]: E1201 15:01:00.151052 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd106c4-731c-46aa-b8a5-bf4fc4d8c464" containerName="copy" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.151058 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd106c4-731c-46aa-b8a5-bf4fc4d8c464" containerName="copy" Dec 01 15:01:00 crc kubenswrapper[4585]: E1201 15:01:00.151075 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbcbc065-131e-4cf6-9cfe-c79b0f729b49" containerName="collect-profiles" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.151081 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcbc065-131e-4cf6-9cfe-c79b0f729b49" containerName="collect-profiles" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.151292 4585 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fbcbc065-131e-4cf6-9cfe-c79b0f729b49" containerName="collect-profiles" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.151306 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcd106c4-731c-46aa-b8a5-bf4fc4d8c464" containerName="gather" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.151334 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcd106c4-731c-46aa-b8a5-bf4fc4d8c464" containerName="copy" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.151896 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.164415 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29410021-xkmd9"] Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.275728 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-config-data\") pod \"keystone-cron-29410021-xkmd9\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.276216 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk92b\" (UniqueName: \"kubernetes.io/projected/1a3b06f9-33c4-4bc8-9d94-3bd503665381-kube-api-access-jk92b\") pod \"keystone-cron-29410021-xkmd9\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.276443 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-combined-ca-bundle\") pod \"keystone-cron-29410021-xkmd9\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.276568 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-fernet-keys\") pod \"keystone-cron-29410021-xkmd9\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.378728 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk92b\" (UniqueName: \"kubernetes.io/projected/1a3b06f9-33c4-4bc8-9d94-3bd503665381-kube-api-access-jk92b\") pod \"keystone-cron-29410021-xkmd9\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.379046 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-combined-ca-bundle\") pod \"keystone-cron-29410021-xkmd9\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.379157 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-fernet-keys\") pod \"keystone-cron-29410021-xkmd9\" (UID: 
\"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.379279 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-config-data\") pod \"keystone-cron-29410021-xkmd9\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.384707 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-config-data\") pod \"keystone-cron-29410021-xkmd9\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.385057 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-fernet-keys\") pod \"keystone-cron-29410021-xkmd9\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.385649 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-combined-ca-bundle\") pod \"keystone-cron-29410021-xkmd9\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.396380 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk92b\" (UniqueName: \"kubernetes.io/projected/1a3b06f9-33c4-4bc8-9d94-3bd503665381-kube-api-access-jk92b\") pod \"keystone-cron-29410021-xkmd9\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.516042 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:00 crc kubenswrapper[4585]: I1201 15:01:00.941336 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29410021-xkmd9"] Dec 01 15:01:00 crc kubenswrapper[4585]: W1201 15:01:00.947584 4585 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a3b06f9_33c4_4bc8_9d94_3bd503665381.slice/crio-543ba0f4719a5a3b6d0ba7b2a21be9d8159bb2b6a48bfe8c887e95ff5d127770 WatchSource:0}: Error finding container 543ba0f4719a5a3b6d0ba7b2a21be9d8159bb2b6a48bfe8c887e95ff5d127770: Status 404 returned error can't find the container with id 543ba0f4719a5a3b6d0ba7b2a21be9d8159bb2b6a48bfe8c887e95ff5d127770 Dec 01 15:01:01 crc kubenswrapper[4585]: I1201 15:01:01.181709 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410021-xkmd9" event={"ID":"1a3b06f9-33c4-4bc8-9d94-3bd503665381","Type":"ContainerStarted","Data":"9a9a487b35f30c0d75a86d5cba0caf3fe7bbe45ab9d54be83d66fa768a342f72"} Dec 01 15:01:01 crc kubenswrapper[4585]: I1201 15:01:01.181753 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410021-xkmd9" event={"ID":"1a3b06f9-33c4-4bc8-9d94-3bd503665381","Type":"ContainerStarted","Data":"543ba0f4719a5a3b6d0ba7b2a21be9d8159bb2b6a48bfe8c887e95ff5d127770"} Dec 01 15:01:01 crc kubenswrapper[4585]: I1201 15:01:01.198605 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29410021-xkmd9" podStartSLOduration=1.1985901509999999 podStartE2EDuration="1.198590151s" podCreationTimestamp="2025-12-01 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 15:01:01.195002266 +0000 UTC m=+3775.179216121" watchObservedRunningTime="2025-12-01 15:01:01.198590151 +0000 UTC m=+3775.182804006" Dec 01 15:01:04 crc kubenswrapper[4585]: I1201 15:01:04.212056 4585 generic.go:334] "Generic (PLEG): container finished" podID="1a3b06f9-33c4-4bc8-9d94-3bd503665381" containerID="9a9a487b35f30c0d75a86d5cba0caf3fe7bbe45ab9d54be83d66fa768a342f72" exitCode=0 Dec 01 15:01:04 crc kubenswrapper[4585]: I1201 15:01:04.212101 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410021-xkmd9" event={"ID":"1a3b06f9-33c4-4bc8-9d94-3bd503665381","Type":"ContainerDied","Data":"9a9a487b35f30c0d75a86d5cba0caf3fe7bbe45ab9d54be83d66fa768a342f72"} Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.534845 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.590732 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-fernet-keys\") pod \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.590792 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-config-data\") pod \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.590912 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-combined-ca-bundle\") pod \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.590937 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk92b\" (UniqueName: \"kubernetes.io/projected/1a3b06f9-33c4-4bc8-9d94-3bd503665381-kube-api-access-jk92b\") pod \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\" (UID: \"1a3b06f9-33c4-4bc8-9d94-3bd503665381\") " Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.597465 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1a3b06f9-33c4-4bc8-9d94-3bd503665381" (UID: "1a3b06f9-33c4-4bc8-9d94-3bd503665381"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.598545 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a3b06f9-33c4-4bc8-9d94-3bd503665381-kube-api-access-jk92b" (OuterVolumeSpecName: "kube-api-access-jk92b") pod "1a3b06f9-33c4-4bc8-9d94-3bd503665381" (UID: "1a3b06f9-33c4-4bc8-9d94-3bd503665381"). InnerVolumeSpecName "kube-api-access-jk92b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.634939 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a3b06f9-33c4-4bc8-9d94-3bd503665381" (UID: "1a3b06f9-33c4-4bc8-9d94-3bd503665381"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.643355 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-config-data" (OuterVolumeSpecName: "config-data") pod "1a3b06f9-33c4-4bc8-9d94-3bd503665381" (UID: "1a3b06f9-33c4-4bc8-9d94-3bd503665381"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.692531 4585 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.692562 4585 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.692573 4585 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3b06f9-33c4-4bc8-9d94-3bd503665381-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:05 crc kubenswrapper[4585]: I1201 15:01:05.692586 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk92b\" (UniqueName: \"kubernetes.io/projected/1a3b06f9-33c4-4bc8-9d94-3bd503665381-kube-api-access-jk92b\") on node \"crc\" DevicePath \"\"" Dec 01 15:01:06 crc kubenswrapper[4585]: I1201 15:01:06.229350 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29410021-xkmd9" event={"ID":"1a3b06f9-33c4-4bc8-9d94-3bd503665381","Type":"ContainerDied","Data":"543ba0f4719a5a3b6d0ba7b2a21be9d8159bb2b6a48bfe8c887e95ff5d127770"} Dec 01 15:01:06 crc kubenswrapper[4585]: I1201 15:01:06.229577 4585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="543ba0f4719a5a3b6d0ba7b2a21be9d8159bb2b6a48bfe8c887e95ff5d127770" Dec 01 15:01:06 crc kubenswrapper[4585]: I1201 15:01:06.229411 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29410021-xkmd9" Dec 01 15:01:13 crc kubenswrapper[4585]: I1201 15:01:13.716064 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:01:13 crc kubenswrapper[4585]: I1201 15:01:13.716578 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:01:35 crc kubenswrapper[4585]: I1201 15:01:35.940348 4585 scope.go:117] "RemoveContainer" containerID="77865c11d7cc0630175435442cf13442d0584636028b440e11b3b26912d69a49" Dec 01 15:01:43 crc kubenswrapper[4585]: I1201 15:01:43.716132 4585 patch_prober.go:28] interesting pod/machine-config-daemon-lj9gs container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 15:01:43 crc kubenswrapper[4585]: I1201 15:01:43.717637 4585 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 15:01:43 crc 
kubenswrapper[4585]: I1201 15:01:43.717773 4585 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" Dec 01 15:01:43 crc kubenswrapper[4585]: I1201 15:01:43.718716 4585 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b"} pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 15:01:43 crc kubenswrapper[4585]: I1201 15:01:43.718874 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerName="machine-config-daemon" containerID="cri-o://696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" gracePeriod=600 Dec 01 15:01:43 crc kubenswrapper[4585]: E1201 15:01:43.838545 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:01:44 crc kubenswrapper[4585]: I1201 15:01:44.572015 4585 generic.go:334] "Generic (PLEG): container finished" podID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" exitCode=0 Dec 01 15:01:44 crc kubenswrapper[4585]: I1201 15:01:44.572060 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" event={"ID":"f7beb40d-bcd0-43c8-a9fe-c32408790a4c","Type":"ContainerDied","Data":"696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b"} Dec 01 15:01:44 crc kubenswrapper[4585]: I1201 15:01:44.572093 4585 scope.go:117] "RemoveContainer" containerID="1e0b4194459fe2d90ee7c18b376805cee001984de119c4ef64ab23cb42caa200" Dec 01 15:01:44 crc kubenswrapper[4585]: I1201 15:01:44.573632 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:01:44 crc kubenswrapper[4585]: E1201 15:01:44.573998 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:01:59 crc kubenswrapper[4585]: I1201 15:01:59.412400 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:01:59 crc kubenswrapper[4585]: E1201 15:01:59.413209 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:02:14 crc kubenswrapper[4585]: I1201 15:02:14.413393 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:02:14 crc kubenswrapper[4585]: E1201 15:02:14.414142 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:02:25 crc kubenswrapper[4585]: I1201 15:02:25.413097 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:02:25 crc kubenswrapper[4585]: E1201 15:02:25.413821 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:02:36 crc kubenswrapper[4585]: I1201 15:02:36.049394 4585 scope.go:117] "RemoveContainer" containerID="7ea08ca6f15d19e75cc2204ed4d58b03151a7e5bc69be3ec784126fbeb3e37eb" Dec 01 15:02:39 crc kubenswrapper[4585]: I1201 15:02:39.412153 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:02:39 crc kubenswrapper[4585]: E1201 15:02:39.412953 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:02:54 crc kubenswrapper[4585]: I1201 15:02:54.412406 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:02:54 crc kubenswrapper[4585]: E1201 15:02:54.413142 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:03:08 crc kubenswrapper[4585]: I1201 15:03:08.411944 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:03:08 crc kubenswrapper[4585]: E1201 15:03:08.412818 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:03:22 crc kubenswrapper[4585]: I1201 15:03:22.412438 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:03:22 crc kubenswrapper[4585]: E1201 15:03:22.413396 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:03:37 crc kubenswrapper[4585]: I1201 15:03:37.412840 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:03:37 crc kubenswrapper[4585]: E1201 15:03:37.413652 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.660852 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-64hn9"] Dec 01 15:03:42 crc kubenswrapper[4585]: E1201 15:03:42.662060 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3b06f9-33c4-4bc8-9d94-3bd503665381" containerName="keystone-cron" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.662084 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3b06f9-33c4-4bc8-9d94-3bd503665381" containerName="keystone-cron" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.662446 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a3b06f9-33c4-4bc8-9d94-3bd503665381" containerName="keystone-cron" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.664796 4585 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.694121 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64hn9"] Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.714376 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f18d0f0e-a538-4195-8108-e7324c83cc1a-utilities\") pod \"redhat-marketplace-64hn9\" (UID: \"f18d0f0e-a538-4195-8108-e7324c83cc1a\") " pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.714488 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f18d0f0e-a538-4195-8108-e7324c83cc1a-catalog-content\") pod \"redhat-marketplace-64hn9\" (UID: \"f18d0f0e-a538-4195-8108-e7324c83cc1a\") " pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.714770 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm5hz\" (UniqueName: \"kubernetes.io/projected/f18d0f0e-a538-4195-8108-e7324c83cc1a-kube-api-access-lm5hz\") pod \"redhat-marketplace-64hn9\" (UID: \"f18d0f0e-a538-4195-8108-e7324c83cc1a\") " pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.816870 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm5hz\" (UniqueName: \"kubernetes.io/projected/f18d0f0e-a538-4195-8108-e7324c83cc1a-kube-api-access-lm5hz\") pod \"redhat-marketplace-64hn9\" (UID: \"f18d0f0e-a538-4195-8108-e7324c83cc1a\") " pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.816960 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f18d0f0e-a538-4195-8108-e7324c83cc1a-utilities\") pod \"redhat-marketplace-64hn9\" (UID: \"f18d0f0e-a538-4195-8108-e7324c83cc1a\") " pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.817035 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f18d0f0e-a538-4195-8108-e7324c83cc1a-catalog-content\") pod \"redhat-marketplace-64hn9\" (UID: \"f18d0f0e-a538-4195-8108-e7324c83cc1a\") " pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.817539 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f18d0f0e-a538-4195-8108-e7324c83cc1a-utilities\") pod \"redhat-marketplace-64hn9\" (UID: \"f18d0f0e-a538-4195-8108-e7324c83cc1a\") " pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.817551 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f18d0f0e-a538-4195-8108-e7324c83cc1a-catalog-content\") pod \"redhat-marketplace-64hn9\" (UID: \"f18d0f0e-a538-4195-8108-e7324c83cc1a\") " pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:42 crc kubenswrapper[4585]: I1201 15:03:42.834935 4585 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lm5hz\" (UniqueName: \"kubernetes.io/projected/f18d0f0e-a538-4195-8108-e7324c83cc1a-kube-api-access-lm5hz\") pod \"redhat-marketplace-64hn9\" (UID: \"f18d0f0e-a538-4195-8108-e7324c83cc1a\") " pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:43 crc kubenswrapper[4585]: I1201 15:03:43.003126 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:43 crc kubenswrapper[4585]: I1201 15:03:43.598039 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64hn9"] Dec 01 15:03:44 crc kubenswrapper[4585]: I1201 15:03:44.253789 4585 generic.go:334] "Generic (PLEG): container finished" podID="f18d0f0e-a538-4195-8108-e7324c83cc1a" containerID="03a96e80299d717c60b2a3110850928f48e2386a808e771ee293e669e5e7681b" exitCode=0 Dec 01 15:03:44 crc kubenswrapper[4585]: I1201 15:03:44.253824 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64hn9" event={"ID":"f18d0f0e-a538-4195-8108-e7324c83cc1a","Type":"ContainerDied","Data":"03a96e80299d717c60b2a3110850928f48e2386a808e771ee293e669e5e7681b"} Dec 01 15:03:44 crc kubenswrapper[4585]: I1201 15:03:44.254119 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64hn9" event={"ID":"f18d0f0e-a538-4195-8108-e7324c83cc1a","Type":"ContainerStarted","Data":"d5dedb85f2acd8053c1bd41b2dd0a2479aab50129caf41011fd90b8390148911"} Dec 01 15:03:44 crc kubenswrapper[4585]: I1201 15:03:44.256062 4585 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 15:03:46 crc kubenswrapper[4585]: I1201 15:03:46.270064 4585 generic.go:334] "Generic (PLEG): container finished" podID="f18d0f0e-a538-4195-8108-e7324c83cc1a" containerID="5961a0709bd39df23bacf9acf0b3b9043f671c7cd71f94783e8e025dd4c0041f" exitCode=0 Dec 01 15:03:46 crc kubenswrapper[4585]: I1201 15:03:46.270152 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64hn9" event={"ID":"f18d0f0e-a538-4195-8108-e7324c83cc1a","Type":"ContainerDied","Data":"5961a0709bd39df23bacf9acf0b3b9043f671c7cd71f94783e8e025dd4c0041f"} Dec 01 15:03:47 crc kubenswrapper[4585]: I1201 15:03:47.283422 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64hn9" event={"ID":"f18d0f0e-a538-4195-8108-e7324c83cc1a","Type":"ContainerStarted","Data":"5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6"} Dec 01 15:03:47 crc kubenswrapper[4585]: I1201 15:03:47.308555 4585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-64hn9" podStartSLOduration=2.856608984 podStartE2EDuration="5.30853542s" podCreationTimestamp="2025-12-01 15:03:42 +0000 UTC" firstStartedPulling="2025-12-01 15:03:44.255853044 +0000 UTC m=+3938.240066899" lastFinishedPulling="2025-12-01 15:03:46.70777948 +0000 UTC m=+3940.691993335" observedRunningTime="2025-12-01 15:03:47.303769962 +0000 UTC m=+3941.287983807" watchObservedRunningTime="2025-12-01 15:03:47.30853542 +0000 UTC m=+3941.292749275" Dec 01 15:03:50 crc kubenswrapper[4585]: I1201 15:03:50.412616 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:03:50 crc kubenswrapper[4585]: E1201 15:03:50.413269 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:03:53 crc kubenswrapper[4585]: I1201 15:03:53.003825 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:53 crc kubenswrapper[4585]: I1201 15:03:53.004096 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:53 crc kubenswrapper[4585]: I1201 15:03:53.048638 4585 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:53 crc kubenswrapper[4585]: I1201 15:03:53.386244 4585 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:53 crc kubenswrapper[4585]: I1201 15:03:53.435846 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64hn9"] Dec 01 15:03:55 crc kubenswrapper[4585]: I1201 15:03:55.353352 4585 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-64hn9" podUID="f18d0f0e-a538-4195-8108-e7324c83cc1a" containerName="registry-server" containerID="cri-o://5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6" gracePeriod=2 Dec 01 15:03:55 crc kubenswrapper[4585]: I1201 15:03:55.782813 4585 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:55 crc kubenswrapper[4585]: I1201 15:03:55.788999 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm5hz\" (UniqueName: \"kubernetes.io/projected/f18d0f0e-a538-4195-8108-e7324c83cc1a-kube-api-access-lm5hz\") pod \"f18d0f0e-a538-4195-8108-e7324c83cc1a\" (UID: \"f18d0f0e-a538-4195-8108-e7324c83cc1a\") " Dec 01 15:03:55 crc kubenswrapper[4585]: I1201 15:03:55.789165 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f18d0f0e-a538-4195-8108-e7324c83cc1a-utilities\") pod \"f18d0f0e-a538-4195-8108-e7324c83cc1a\" (UID: \"f18d0f0e-a538-4195-8108-e7324c83cc1a\") " Dec 01 15:03:55 crc kubenswrapper[4585]: I1201 15:03:55.789311 4585 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f18d0f0e-a538-4195-8108-e7324c83cc1a-catalog-content\") pod \"f18d0f0e-a538-4195-8108-e7324c83cc1a\" (UID: \"f18d0f0e-a538-4195-8108-e7324c83cc1a\") " Dec 01 15:03:55 crc kubenswrapper[4585]: I1201 15:03:55.790182 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f18d0f0e-a538-4195-8108-e7324c83cc1a-utilities" (OuterVolumeSpecName: "utilities") pod "f18d0f0e-a538-4195-8108-e7324c83cc1a" (UID: "f18d0f0e-a538-4195-8108-e7324c83cc1a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:03:55 crc kubenswrapper[4585]: I1201 15:03:55.796281 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f18d0f0e-a538-4195-8108-e7324c83cc1a-kube-api-access-lm5hz" (OuterVolumeSpecName: "kube-api-access-lm5hz") pod "f18d0f0e-a538-4195-8108-e7324c83cc1a" (UID: "f18d0f0e-a538-4195-8108-e7324c83cc1a"). InnerVolumeSpecName "kube-api-access-lm5hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 15:03:55 crc kubenswrapper[4585]: I1201 15:03:55.811541 4585 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f18d0f0e-a538-4195-8108-e7324c83cc1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f18d0f0e-a538-4195-8108-e7324c83cc1a" (UID: "f18d0f0e-a538-4195-8108-e7324c83cc1a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 15:03:55 crc kubenswrapper[4585]: I1201 15:03:55.891387 4585 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f18d0f0e-a538-4195-8108-e7324c83cc1a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 15:03:55 crc kubenswrapper[4585]: I1201 15:03:55.891687 4585 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm5hz\" (UniqueName: \"kubernetes.io/projected/f18d0f0e-a538-4195-8108-e7324c83cc1a-kube-api-access-lm5hz\") on node \"crc\" DevicePath \"\"" Dec 01 15:03:55 crc kubenswrapper[4585]: I1201 15:03:55.891795 4585 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f18d0f0e-a538-4195-8108-e7324c83cc1a-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.364747 4585 generic.go:334] "Generic (PLEG): container finished" podID="f18d0f0e-a538-4195-8108-e7324c83cc1a" containerID="5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6" exitCode=0 Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.364793 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64hn9" event={"ID":"f18d0f0e-a538-4195-8108-e7324c83cc1a","Type":"ContainerDied","Data":"5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6"} Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.364825 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64hn9" event={"ID":"f18d0f0e-a538-4195-8108-e7324c83cc1a","Type":"ContainerDied","Data":"d5dedb85f2acd8053c1bd41b2dd0a2479aab50129caf41011fd90b8390148911"} Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.364851 4585 scope.go:117] "RemoveContainer" containerID="5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6" Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.364851 4585 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64hn9" Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.385392 4585 scope.go:117] "RemoveContainer" containerID="5961a0709bd39df23bacf9acf0b3b9043f671c7cd71f94783e8e025dd4c0041f" Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.407236 4585 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64hn9"] Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.416539 4585 scope.go:117] "RemoveContainer" containerID="03a96e80299d717c60b2a3110850928f48e2386a808e771ee293e669e5e7681b" Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.441827 4585 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-64hn9"] Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.470127 4585 scope.go:117] "RemoveContainer" containerID="5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6" Dec 01 15:03:56 crc kubenswrapper[4585]: E1201 15:03:56.472448 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6\": container with ID starting with 5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6 not found: ID does not exist" containerID="5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6" Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.472683 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6"} err="failed to get container status \"5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6\": rpc error: code = NotFound desc = could not find container \"5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6\": container with ID starting with 5f5c7fcab1776f37849e5f40e9c8976abca93dcf0ae88d280c258340473bc4a6 not found: ID does not exist" Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.472720 4585 scope.go:117] "RemoveContainer" containerID="5961a0709bd39df23bacf9acf0b3b9043f671c7cd71f94783e8e025dd4c0041f" Dec 01 15:03:56 crc kubenswrapper[4585]: E1201 15:03:56.473224 4585 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5961a0709bd39df23bacf9acf0b3b9043f671c7cd71f94783e8e025dd4c0041f\": container with ID starting with 5961a0709bd39df23bacf9acf0b3b9043f671c7cd71f94783e8e025dd4c0041f not found: ID does not exist" containerID="5961a0709bd39df23bacf9acf0b3b9043f671c7cd71f94783e8e025dd4c0041f" Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.473265 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5961a0709bd39df23bacf9acf0b3b9043f671c7cd71f94783e8e025dd4c0041f"} err="failed to get container status \"5961a0709bd39df23bacf9acf0b3b9043f671c7cd71f94783e8e025dd4c0041f\": rpc error: code = NotFound desc = could not find container \"5961a0709bd39df23bacf9acf0b3b9043f671c7cd71f94783e8e025dd4c0041f\": container with ID starting with 5961a0709bd39df23bacf9acf0b3b9043f671c7cd71f94783e8e025dd4c0041f not found: ID does not exist" Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.473294 4585 scope.go:117] "RemoveContainer" containerID="03a96e80299d717c60b2a3110850928f48e2386a808e771ee293e669e5e7681b" Dec 01 15:03:56 crc kubenswrapper[4585]: E1201 15:03:56.473744 4585 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"03a96e80299d717c60b2a3110850928f48e2386a808e771ee293e669e5e7681b\": container with ID starting with 03a96e80299d717c60b2a3110850928f48e2386a808e771ee293e669e5e7681b not found: ID does not exist" containerID="03a96e80299d717c60b2a3110850928f48e2386a808e771ee293e669e5e7681b" Dec 01 15:03:56 crc kubenswrapper[4585]: I1201 15:03:56.473765 4585 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a96e80299d717c60b2a3110850928f48e2386a808e771ee293e669e5e7681b"} err="failed to get container status \"03a96e80299d717c60b2a3110850928f48e2386a808e771ee293e669e5e7681b\": rpc error: code = NotFound desc = could not find container \"03a96e80299d717c60b2a3110850928f48e2386a808e771ee293e669e5e7681b\": container with ID starting with 03a96e80299d717c60b2a3110850928f48e2386a808e771ee293e669e5e7681b not found: ID does not exist" Dec 01 15:03:58 crc kubenswrapper[4585]: I1201 15:03:58.423458 4585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f18d0f0e-a538-4195-8108-e7324c83cc1a" path="/var/lib/kubelet/pods/f18d0f0e-a538-4195-8108-e7324c83cc1a/volumes" Dec 01 15:04:05 crc kubenswrapper[4585]: I1201 15:04:05.412824 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:04:05 crc kubenswrapper[4585]: E1201 15:04:05.413679 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:04:19 crc kubenswrapper[4585]: I1201 15:04:19.412640 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:04:19 crc kubenswrapper[4585]: E1201 15:04:19.413430 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.453441 4585 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hgwt6"] Dec 01 15:04:24 crc kubenswrapper[4585]: E1201 15:04:24.455234 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f18d0f0e-a538-4195-8108-e7324c83cc1a" containerName="registry-server" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.455346 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f18d0f0e-a538-4195-8108-e7324c83cc1a" containerName="registry-server" Dec 01 15:04:24 crc kubenswrapper[4585]: E1201 15:04:24.455435 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f18d0f0e-a538-4195-8108-e7324c83cc1a" containerName="extract-content" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.455507 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f18d0f0e-a538-4195-8108-e7324c83cc1a" containerName="extract-content" Dec 01 15:04:24 crc kubenswrapper[4585]: E1201 
15:04:24.455592 4585 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f18d0f0e-a538-4195-8108-e7324c83cc1a" containerName="extract-utilities" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.455667 4585 state_mem.go:107] "Deleted CPUSet assignment" podUID="f18d0f0e-a538-4195-8108-e7324c83cc1a" containerName="extract-utilities" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.456019 4585 memory_manager.go:354] "RemoveStaleState removing state" podUID="f18d0f0e-a538-4195-8108-e7324c83cc1a" containerName="registry-server" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.457994 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hgwt6" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.480093 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hgwt6"] Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.529165 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kj8b\" (UniqueName: \"kubernetes.io/projected/1c744d2e-81ee-4ea9-b804-71d6fdfaae4b-kube-api-access-9kj8b\") pod \"certified-operators-hgwt6\" (UID: \"1c744d2e-81ee-4ea9-b804-71d6fdfaae4b\") " pod="openshift-marketplace/certified-operators-hgwt6" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.529409 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c744d2e-81ee-4ea9-b804-71d6fdfaae4b-catalog-content\") pod \"certified-operators-hgwt6\" (UID: \"1c744d2e-81ee-4ea9-b804-71d6fdfaae4b\") " pod="openshift-marketplace/certified-operators-hgwt6" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.529434 4585 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c744d2e-81ee-4ea9-b804-71d6fdfaae4b-utilities\") pod \"certified-operators-hgwt6\" (UID: \"1c744d2e-81ee-4ea9-b804-71d6fdfaae4b\") " pod="openshift-marketplace/certified-operators-hgwt6" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.630838 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c744d2e-81ee-4ea9-b804-71d6fdfaae4b-catalog-content\") pod \"certified-operators-hgwt6\" (UID: \"1c744d2e-81ee-4ea9-b804-71d6fdfaae4b\") " pod="openshift-marketplace/certified-operators-hgwt6" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.630897 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c744d2e-81ee-4ea9-b804-71d6fdfaae4b-utilities\") pod \"certified-operators-hgwt6\" (UID: \"1c744d2e-81ee-4ea9-b804-71d6fdfaae4b\") " pod="openshift-marketplace/certified-operators-hgwt6" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.631004 4585 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kj8b\" (UniqueName: \"kubernetes.io/projected/1c744d2e-81ee-4ea9-b804-71d6fdfaae4b-kube-api-access-9kj8b\") pod \"certified-operators-hgwt6\" (UID: \"1c744d2e-81ee-4ea9-b804-71d6fdfaae4b\") " pod="openshift-marketplace/certified-operators-hgwt6" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.631526 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1c744d2e-81ee-4ea9-b804-71d6fdfaae4b-catalog-content\") pod \"certified-operators-hgwt6\" (UID: \"1c744d2e-81ee-4ea9-b804-71d6fdfaae4b\") " pod="openshift-marketplace/certified-operators-hgwt6" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.631541 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c744d2e-81ee-4ea9-b804-71d6fdfaae4b-utilities\") pod \"certified-operators-hgwt6\" (UID: \"1c744d2e-81ee-4ea9-b804-71d6fdfaae4b\") " pod="openshift-marketplace/certified-operators-hgwt6" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.656684 4585 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kj8b\" (UniqueName: \"kubernetes.io/projected/1c744d2e-81ee-4ea9-b804-71d6fdfaae4b-kube-api-access-9kj8b\") pod \"certified-operators-hgwt6\" (UID: \"1c744d2e-81ee-4ea9-b804-71d6fdfaae4b\") " pod="openshift-marketplace/certified-operators-hgwt6" Dec 01 15:04:24 crc kubenswrapper[4585]: I1201 15:04:24.793476 4585 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hgwt6" Dec 01 15:04:25 crc kubenswrapper[4585]: I1201 15:04:25.433381 4585 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hgwt6"] Dec 01 15:04:25 crc kubenswrapper[4585]: I1201 15:04:25.634171 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgwt6" event={"ID":"1c744d2e-81ee-4ea9-b804-71d6fdfaae4b","Type":"ContainerStarted","Data":"8a16a90e94c4bf93ca18444755a1dff0754423fea4e6df19f01eaf80b350fa77"} Dec 01 15:04:26 crc kubenswrapper[4585]: I1201 15:04:26.645470 4585 generic.go:334] "Generic (PLEG): container finished" podID="1c744d2e-81ee-4ea9-b804-71d6fdfaae4b" containerID="ceea5dfd6d9ff992579260067e4ba5fd4963f73b906d7a3cf2e412c08fe2e281" exitCode=0 Dec 01 15:04:26 crc kubenswrapper[4585]: I1201 15:04:26.645572 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgwt6" event={"ID":"1c744d2e-81ee-4ea9-b804-71d6fdfaae4b","Type":"ContainerDied","Data":"ceea5dfd6d9ff992579260067e4ba5fd4963f73b906d7a3cf2e412c08fe2e281"} Dec 01 15:04:28 crc kubenswrapper[4585]: I1201 15:04:28.676518 4585 generic.go:334] "Generic (PLEG): container finished" podID="1c744d2e-81ee-4ea9-b804-71d6fdfaae4b" containerID="b55dcd041eba9df30edd4491be6193d7af910e06feb379f75cf6217eae6a0792" exitCode=0 Dec 01 15:04:28 crc kubenswrapper[4585]: I1201 15:04:28.676606 4585 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgwt6" event={"ID":"1c744d2e-81ee-4ea9-b804-71d6fdfaae4b","Type":"ContainerDied","Data":"b55dcd041eba9df30edd4491be6193d7af910e06feb379f75cf6217eae6a0792"} Dec 01 15:04:30 crc kubenswrapper[4585]: I1201 15:04:30.413056 4585 scope.go:117] "RemoveContainer" containerID="696f9236094d9e4c23cc19363f2464907e2ec3ded59ec494b529b4e5801f208b" Dec 01 15:04:30 crc kubenswrapper[4585]: E1201 15:04:30.415056 4585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lj9gs_openshift-machine-config-operator(f7beb40d-bcd0-43c8-a9fe-c32408790a4c)\"" pod="openshift-machine-config-operator/machine-config-daemon-lj9gs" podUID="f7beb40d-bcd0-43c8-a9fe-c32408790a4c"